Sep 29 09:44:33 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 29 09:44:33 crc restorecon[4623]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 09:44:33 crc restorecon[4623]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc 
restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc 
restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 
09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 
crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 09:44:33 crc restorecon[4623]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:33 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc 
restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc 
restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc 
restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc 
restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc 
restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 29 09:44:34 crc restorecon[4623]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 29 09:44:35 crc kubenswrapper[4922]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 29 09:44:35 crc kubenswrapper[4922]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 29 09:44:35 crc kubenswrapper[4922]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 29 09:44:35 crc kubenswrapper[4922]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 29 09:44:35 crc kubenswrapper[4922]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 29 09:44:35 crc kubenswrapper[4922]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.187196 4922 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193474 4922 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193502 4922 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193508 4922 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193513 4922 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193517 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193522 4922 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193527 4922 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193532 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193536 4922 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193541 4922 
feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193546 4922 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193551 4922 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193555 4922 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193559 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193564 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193571 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193575 4922 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193580 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193584 4922 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193588 4922 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193592 4922 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193596 4922 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193599 4922 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193604 4922 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAzure Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193609 4922 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193615 4922 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193620 4922 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193626 4922 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193630 4922 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193635 4922 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193639 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193644 4922 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193648 4922 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193653 4922 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193657 4922 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193662 4922 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193666 4922 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193671 4922 feature_gate.go:330] unrecognized feature gate: 
ClusterAPIInstall Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193675 4922 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193680 4922 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193685 4922 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193690 4922 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193694 4922 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193698 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193701 4922 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193705 4922 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193709 4922 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193712 4922 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193717 4922 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193723 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193728 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193732 4922 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193737 4922 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193741 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193746 4922 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193751 4922 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193757 4922 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193763 4922 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193777 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193782 4922 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193786 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193790 4922 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193794 4922 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193797 4922 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193801 4922 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193805 4922 feature_gate.go:330] unrecognized feature gate: Example Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193809 4922 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193813 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193817 4922 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193820 4922 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.193843 4922 feature_gate.go:330] unrecognized feature 
gate: OpenShiftPodSecurityAdmission Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195171 4922 flags.go:64] FLAG: --address="0.0.0.0" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195186 4922 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195195 4922 flags.go:64] FLAG: --anonymous-auth="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195200 4922 flags.go:64] FLAG: --application-metrics-count-limit="100" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195206 4922 flags.go:64] FLAG: --authentication-token-webhook="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195211 4922 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195217 4922 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195222 4922 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195226 4922 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195231 4922 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195235 4922 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195240 4922 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195244 4922 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195248 4922 flags.go:64] FLAG: --cgroup-root="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195252 4922 flags.go:64] FLAG: --cgroups-per-qos="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195256 4922 flags.go:64] FLAG: --client-ca-file="" Sep 29 
09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195260 4922 flags.go:64] FLAG: --cloud-config="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195265 4922 flags.go:64] FLAG: --cloud-provider="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195269 4922 flags.go:64] FLAG: --cluster-dns="[]" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195274 4922 flags.go:64] FLAG: --cluster-domain="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195278 4922 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195282 4922 flags.go:64] FLAG: --config-dir="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195286 4922 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195291 4922 flags.go:64] FLAG: --container-log-max-files="5" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195296 4922 flags.go:64] FLAG: --container-log-max-size="10Mi" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195300 4922 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195304 4922 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195309 4922 flags.go:64] FLAG: --containerd-namespace="k8s.io" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195313 4922 flags.go:64] FLAG: --contention-profiling="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195317 4922 flags.go:64] FLAG: --cpu-cfs-quota="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195321 4922 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195326 4922 flags.go:64] FLAG: --cpu-manager-policy="none" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195330 4922 flags.go:64] FLAG: 
--cpu-manager-policy-options="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195335 4922 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195339 4922 flags.go:64] FLAG: --enable-controller-attach-detach="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195343 4922 flags.go:64] FLAG: --enable-debugging-handlers="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195347 4922 flags.go:64] FLAG: --enable-load-reader="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195351 4922 flags.go:64] FLAG: --enable-server="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195355 4922 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195360 4922 flags.go:64] FLAG: --event-burst="100" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195365 4922 flags.go:64] FLAG: --event-qps="50" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195369 4922 flags.go:64] FLAG: --event-storage-age-limit="default=0" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195373 4922 flags.go:64] FLAG: --event-storage-event-limit="default=0" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195377 4922 flags.go:64] FLAG: --eviction-hard="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195382 4922 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195386 4922 flags.go:64] FLAG: --eviction-minimum-reclaim="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195390 4922 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195394 4922 flags.go:64] FLAG: --eviction-soft="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195398 4922 flags.go:64] FLAG: --eviction-soft-grace-period="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195402 4922 
flags.go:64] FLAG: --exit-on-lock-contention="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195407 4922 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195411 4922 flags.go:64] FLAG: --experimental-mounter-path="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195415 4922 flags.go:64] FLAG: --fail-cgroupv1="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195420 4922 flags.go:64] FLAG: --fail-swap-on="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195424 4922 flags.go:64] FLAG: --feature-gates="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195428 4922 flags.go:64] FLAG: --file-check-frequency="20s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195433 4922 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195437 4922 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195441 4922 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195445 4922 flags.go:64] FLAG: --healthz-port="10248" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195449 4922 flags.go:64] FLAG: --help="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195454 4922 flags.go:64] FLAG: --hostname-override="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195458 4922 flags.go:64] FLAG: --housekeeping-interval="10s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195462 4922 flags.go:64] FLAG: --http-check-frequency="20s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195466 4922 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195472 4922 flags.go:64] FLAG: --image-credential-provider-config="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195476 4922 
flags.go:64] FLAG: --image-gc-high-threshold="85" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195481 4922 flags.go:64] FLAG: --image-gc-low-threshold="80" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195485 4922 flags.go:64] FLAG: --image-service-endpoint="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195490 4922 flags.go:64] FLAG: --kernel-memcg-notification="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195494 4922 flags.go:64] FLAG: --kube-api-burst="100" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195498 4922 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195502 4922 flags.go:64] FLAG: --kube-api-qps="50" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195506 4922 flags.go:64] FLAG: --kube-reserved="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195510 4922 flags.go:64] FLAG: --kube-reserved-cgroup="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195514 4922 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195519 4922 flags.go:64] FLAG: --kubelet-cgroups="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195523 4922 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195527 4922 flags.go:64] FLAG: --lock-file="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195530 4922 flags.go:64] FLAG: --log-cadvisor-usage="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195534 4922 flags.go:64] FLAG: --log-flush-frequency="5s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195538 4922 flags.go:64] FLAG: --log-json-info-buffer-size="0" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195545 4922 flags.go:64] FLAG: --log-json-split-stream="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195549 4922 
flags.go:64] FLAG: --log-text-info-buffer-size="0" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195553 4922 flags.go:64] FLAG: --log-text-split-stream="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195557 4922 flags.go:64] FLAG: --logging-format="text" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195561 4922 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195565 4922 flags.go:64] FLAG: --make-iptables-util-chains="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195569 4922 flags.go:64] FLAG: --manifest-url="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195573 4922 flags.go:64] FLAG: --manifest-url-header="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195579 4922 flags.go:64] FLAG: --max-housekeeping-interval="15s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195583 4922 flags.go:64] FLAG: --max-open-files="1000000" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195587 4922 flags.go:64] FLAG: --max-pods="110" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195592 4922 flags.go:64] FLAG: --maximum-dead-containers="-1" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195596 4922 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195600 4922 flags.go:64] FLAG: --memory-manager-policy="None" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195604 4922 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195608 4922 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195613 4922 flags.go:64] FLAG: --node-ip="192.168.126.11" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195617 4922 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195626 4922 flags.go:64] FLAG: --node-status-max-images="50" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195630 4922 flags.go:64] FLAG: --node-status-update-frequency="10s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195634 4922 flags.go:64] FLAG: --oom-score-adj="-999" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195639 4922 flags.go:64] FLAG: --pod-cidr="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195643 4922 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195649 4922 flags.go:64] FLAG: --pod-manifest-path="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195653 4922 flags.go:64] FLAG: --pod-max-pids="-1" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195658 4922 flags.go:64] FLAG: --pods-per-core="0" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195662 4922 flags.go:64] FLAG: --port="10250" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195666 4922 flags.go:64] FLAG: --protect-kernel-defaults="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195670 4922 flags.go:64] FLAG: --provider-id="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195674 4922 flags.go:64] FLAG: --qos-reserved="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195678 4922 flags.go:64] FLAG: --read-only-port="10255" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195682 4922 flags.go:64] FLAG: --register-node="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195686 4922 flags.go:64] FLAG: --register-schedulable="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195690 4922 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195700 4922 flags.go:64] FLAG: --registry-burst="10" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195704 4922 flags.go:64] FLAG: --registry-qps="5" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195708 4922 flags.go:64] FLAG: --reserved-cpus="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195712 4922 flags.go:64] FLAG: --reserved-memory="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195718 4922 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195722 4922 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195726 4922 flags.go:64] FLAG: --rotate-certificates="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195730 4922 flags.go:64] FLAG: --rotate-server-certificates="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195734 4922 flags.go:64] FLAG: --runonce="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195738 4922 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195742 4922 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195746 4922 flags.go:64] FLAG: --seccomp-default="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195792 4922 flags.go:64] FLAG: --serialize-image-pulls="true" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195797 4922 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195802 4922 flags.go:64] FLAG: --storage-driver-db="cadvisor" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195806 4922 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195810 
4922 flags.go:64] FLAG: --storage-driver-password="root" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195814 4922 flags.go:64] FLAG: --storage-driver-secure="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195819 4922 flags.go:64] FLAG: --storage-driver-table="stats" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195823 4922 flags.go:64] FLAG: --storage-driver-user="root" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195839 4922 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195843 4922 flags.go:64] FLAG: --sync-frequency="1m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195847 4922 flags.go:64] FLAG: --system-cgroups="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195855 4922 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195862 4922 flags.go:64] FLAG: --system-reserved-cgroup="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195866 4922 flags.go:64] FLAG: --tls-cert-file="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195870 4922 flags.go:64] FLAG: --tls-cipher-suites="[]" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195874 4922 flags.go:64] FLAG: --tls-min-version="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195878 4922 flags.go:64] FLAG: --tls-private-key-file="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195882 4922 flags.go:64] FLAG: --topology-manager-policy="none" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195886 4922 flags.go:64] FLAG: --topology-manager-policy-options="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195890 4922 flags.go:64] FLAG: --topology-manager-scope="container" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195894 4922 flags.go:64] FLAG: --v="2" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195900 4922 
flags.go:64] FLAG: --version="false" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195906 4922 flags.go:64] FLAG: --vmodule="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195910 4922 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.195915 4922 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.199807 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200059 4922 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200069 4922 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200078 4922 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200086 4922 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200095 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200107 4922 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200118 4922 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200128 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200166 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200175 4922 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200184 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200192 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200200 4922 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200208 4922 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200215 4922 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200224 4922 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200231 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200239 4922 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200247 4922 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200254 4922 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 09:44:35 crc kubenswrapper[4922]: 
W0929 09:44:35.200262 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200270 4922 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200279 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200286 4922 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200294 4922 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200302 4922 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200310 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200318 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200325 4922 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200333 4922 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200342 4922 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200350 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200358 4922 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200369 4922 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200379 4922 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200389 4922 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200399 4922 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200408 4922 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200417 4922 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200428 4922 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200438 4922 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200447 4922 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200456 4922 feature_gate.go:330] unrecognized feature gate: Example Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200464 4922 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200472 4922 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200480 4922 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200488 4922 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200496 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 09:44:35 crc 
kubenswrapper[4922]: W0929 09:44:35.200504 4922 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200511 4922 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200519 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200527 4922 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200534 4922 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200542 4922 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200550 4922 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200560 4922 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200569 4922 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200578 4922 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200586 4922 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200594 4922 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200602 4922 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200609 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200618 4922 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200626 4922 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200633 4922 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200641 4922 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200649 4922 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200657 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200665 4922 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.200673 4922 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.200687 4922 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.210108 4922 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.210364 4922 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210419 4922 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210426 4922 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210430 4922 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210434 4922 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210438 4922 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210442 4922 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210446 4922 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210450 4922 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210453 4922 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210457 4922 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210460 4922 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210464 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210467 4922 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210471 4922 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210474 4922 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210478 4922 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210481 4922 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210485 4922 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210489 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210493 4922 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210498 4922 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210502 4922 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210506 4922 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210511 4922 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210515 4922 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210519 4922 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210522 4922 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210526 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210530 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210534 4922 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210538 4922 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210543 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210547 4922 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210552 4922 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210556 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210561 4922 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210565 4922 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210569 4922 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210573 4922 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210576 4922 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210580 4922 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210584 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210587 4922 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210591 4922 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210595 4922 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210598 4922 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210602 4922 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210606 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210609 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210613 4922 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210618 4922 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210622 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210626 4922 feature_gate.go:330] unrecognized feature gate: Example
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210630 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210634 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210638 4922 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210641 4922 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210645 4922 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210648 4922 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210652 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210655 4922 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210659 4922 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210662 4922 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210665 4922 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210669 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210673 4922 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210677 4922 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210682 4922 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210686 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210690 4922 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210694 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.210700 4922 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210804 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210810 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210814 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210819 4922 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210823 4922 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210842 4922 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210849 4922 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210856 4922 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210860 4922 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210865 4922 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210870 4922 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210875 4922 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210879 4922 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210883 4922 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210887 4922 feature_gate.go:330] unrecognized feature gate: Example
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210890 4922 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210894 4922 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210897 4922 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210901 4922 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210905 4922 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210909 4922 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210913 4922 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210917 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210921 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210924 4922 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210928 4922 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210931 4922 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210935 4922 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210939 4922 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210942 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210946 4922 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210949 4922 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210954 4922 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210958 4922 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210962 4922 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210965 4922 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210969 4922 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210972 4922 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210977 4922 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210981 4922 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210984 4922 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210987 4922 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210992 4922 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.210997 4922 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211002 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211006 4922 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211010 4922 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211014 4922 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211017 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211021 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211025 4922 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211029 4922 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211033 4922 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211036 4922 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211040 4922 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211044 4922 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211048 4922 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211051 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211055 4922 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211058 4922 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211062 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211065 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211069 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211072 4922 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211076 4922 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211080 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211083 4922 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211087 4922 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211090 4922 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211093 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.211097 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.211103 4922 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.211228 4922 server.go:940] "Client rotation is on, will bootstrap in background"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.215347 4922 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.215418 4922 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.217168 4922 server.go:997] "Starting client certificate rotation"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.217188 4922 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.217477 4922 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-09 06:45:52.446120175 +0000 UTC
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.217560 4922 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2445h1m17.228564575s for next certificate rotation
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.260192 4922 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.262906 4922 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.286094 4922 log.go:25] "Validated CRI v1 runtime API"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.328389 4922 log.go:25] "Validated CRI v1 image API"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.330824 4922 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.338126 4922 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-29-09-40-18-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.338171 4922 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.367180 4922 manager.go:217] Machine: {Timestamp:2025-09-29 09:44:35.363075035 +0000 UTC m=+0.729305379 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:33c12f62-5b5f-4d4e-9af7-92ce6ab7df30 BootID:8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:da:5b:8f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:da:5b:8f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ff:1a:3d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:26:82:7a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:4d:9c:8f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:77:0b:77 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6a:0a:88:04:cf:bf Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:b6:bf:e7:67:f2:e5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.367742 4922 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.368132 4922 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.368635 4922 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.369081 4922 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.369144 4922 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.369563 4922 topology_manager.go:138] "Creating topology manager with none policy"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.369592 4922 container_manager_linux.go:303] "Creating device plugin manager"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.370411 4922 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.370478 4922 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.371447 4922 state_mem.go:36] "Initialized new in-memory state store"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.371598 4922 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.376477 4922 kubelet.go:418] "Attempting to sync node with API server"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.376520 4922 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.376558 4922 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.376580 4922 kubelet.go:324] "Adding apiserver pod source"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.376598 4922 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.382456 4922 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.384127 4922 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.385096 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.385149 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.385227 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.111:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.385230 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.111:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.389945 4922 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.391903 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.391937 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 
09:44:35.391944 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.391950 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.391964 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.391972 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.391980 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.391992 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.392001 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.392009 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.392052 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.392065 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.393147 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.393662 4922 server.go:1280] "Started kubelet" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.394522 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.394683 4922 
ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.394724 4922 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.395098 4922 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.395900 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.395935 4922 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.395993 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 00:14:09.847791636 +0000 UTC Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.396082 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1094h29m34.451715456s for next certificate rotation Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.396111 4922 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.396125 4922 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.396159 4922 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 29 09:44:35 crc systemd[1]: Started Kubernetes Kubelet. 
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.396206 4922 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.404685 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.111:6443: connect: connection refused" interval="200ms" Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.404747 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.404808 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.111:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.405252 4922 factory.go:55] Registering systemd factory Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.405299 4922 factory.go:221] Registration of the systemd container factory successfully Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.405747 4922 factory.go:153] Registering CRI-O factory Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.405792 4922 factory.go:221] Registration of the crio container factory successfully Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.406018 4922 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or 
directory Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.406074 4922 factory.go:103] Registering Raw factory Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.406109 4922 manager.go:1196] Started watching for new ooms in manager Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.407471 4922 manager.go:319] Starting recovery of all containers Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.407617 4922 server.go:460] "Adding debug handlers to kubelet server" Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.411246 4922 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.111:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869b7ab3357db54 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-29 09:44:35.393575764 +0000 UTC m=+0.759806028,LastTimestamp:2025-09-29 09:44:35.393575764 +0000 UTC m=+0.759806028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.415724 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.415809 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" 
seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.415865 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.415892 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.415918 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.415943 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.415968 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.415992 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 
09:44:35.416018 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416042 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416067 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416090 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416113 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416143 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416168 4922 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416192 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416219 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416245 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416273 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416300 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416326 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416353 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416381 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416407 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416433 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416459 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416492 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416521 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416547 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416648 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416674 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416703 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416769 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416811 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416873 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416904 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416932 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416956 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.416981 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417008 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417032 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417058 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417087 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417119 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417147 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417172 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417199 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417225 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417253 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417279 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417364 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417395 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417433 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417463 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417493 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417521 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417550 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417575 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417601 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417626 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417651 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417676 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417702 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417766 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417794 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417820 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417907 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417940 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417966 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.417990 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418014 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418037 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418061 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418088 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418115 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418143 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418169 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418195 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418221 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418249 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418308 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418347 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418391 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418415 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418443 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418468 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418496 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418521 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418547 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418574 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418599 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418624 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418648 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418674 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418698 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418725 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418750 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418775 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418805 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418869 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418903 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418928 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418953 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.418975 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419003 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419049 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419071 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419090 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419113 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419133 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419154 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419174 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419193 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419217 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419245 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419283 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419319 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" 
Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419346 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419368 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419384 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419401 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419419 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419436 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: 
I0929 09:44:35.419454 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419472 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419488 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419506 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419525 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419545 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419562 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419582 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419599 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419617 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419635 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419653 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419670 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419689 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419707 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419730 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419881 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419909 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419931 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419949 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419966 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.419984 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420004 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420023 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420041 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420060 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420078 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420095 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420113 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420131 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420148 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420165 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420184 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420204 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420220 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420239 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420258 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 
29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420274 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420292 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420310 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420328 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420347 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420365 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420382 4922 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420400 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420416 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420433 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420451 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420468 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420486 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420503 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420522 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420541 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420559 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420576 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420594 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420613 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420630 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.420648 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423336 4922 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423383 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423406 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423425 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423444 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423466 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423484 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423501 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423519 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423536 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423554 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423571 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423590 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423607 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423623 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423640 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423656 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423674 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423692 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423709 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423726 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" 
seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423742 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423758 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423777 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423795 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423812 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423861 4922 reconstruct.go:97] "Volume reconstruction finished" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.423880 4922 reconciler.go:26] "Reconciler: start to sync state" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.425151 4922 manager.go:324] Recovery completed Sep 29 09:44:35 crc 
kubenswrapper[4922]: I0929 09:44:35.434239 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.435792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.435910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.435967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.436898 4922 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.436992 4922 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.437072 4922 state_mem.go:36] "Initialized new in-memory state store" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.448536 4922 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.450395 4922 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.450465 4922 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.450501 4922 kubelet.go:2335] "Starting kubelet main sync loop" Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.450560 4922 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.453961 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.454029 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.111:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.458347 4922 policy_none.go:49] "None policy: Start" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.459141 4922 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.459164 4922 state_mem.go:35] "Initializing new in-memory state store" Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.496678 4922 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.514503 4922 manager.go:334] "Starting Device Plugin manager" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.514617 4922 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.514638 4922 server.go:79] "Starting device plugin registration server" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.515261 4922 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.515284 4922 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.515948 4922 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.516068 4922 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.516079 4922 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.524684 4922 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.551565 4922 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.551743 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.554127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.554166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.554178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.554314 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.554502 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.554561 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.556013 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.556035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.556045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.556587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.556621 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.556633 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.556753 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.556997 
4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557032 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557660 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557869 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557881 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.557910 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.558502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.558532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.558543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.558910 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.558933 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.558943 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.558975 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.558999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.559011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.560169 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.560241 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.560261 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.560180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.560370 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.560401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.560558 
4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.560618 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.561755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.561797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.561814 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.605508 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.111:6443: connect: connection refused" interval="400ms" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.616000 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.617142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.617174 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.617182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.617202 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:44:35 crc kubenswrapper[4922]: 
E0929 09:44:35.617587 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.111:6443: connect: connection refused" node="crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.625928 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626028 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626063 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626094 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626123 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626154 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626182 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626210 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626251 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626283 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626311 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626340 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626425 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626459 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.626487 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.727992 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.728279 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.728336 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.728664 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.728901 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:44:35 
crc kubenswrapper[4922]: I0929 09:44:35.729146 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.729329 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.729547 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.729728 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.729924 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730123 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730301 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730474 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730645 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730814 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.731013 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc 
kubenswrapper[4922]: I0929 09:44:35.729208 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.729966 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.728519 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730164 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.729383 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730351 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.728996 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730559 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.729608 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730710 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.728771 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 
09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.730926 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.729778 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.731116 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.818222 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.825228 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.825282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.825310 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.825345 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:44:35 crc kubenswrapper[4922]: E0929 09:44:35.825813 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.111:6443: connect: connection refused" node="crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.879193 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.886289 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.905392 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: I0929 09:44:35.923135 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.924582 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-def34a5148b40bdf92ff4089d74d44821ad9e054b4359a026f712a58375abd1f WatchSource:0}: Error finding container def34a5148b40bdf92ff4089d74d44821ad9e054b4359a026f712a58375abd1f: Status 404 returned error can't find the container with id def34a5148b40bdf92ff4089d74d44821ad9e054b4359a026f712a58375abd1f Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.925582 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1e90d1e7216efbf026add455c8fbd1744329a48d1c34b216da8bf7cd09879a54 WatchSource:0}: Error finding container 1e90d1e7216efbf026add455c8fbd1744329a48d1c34b216da8bf7cd09879a54: Status 404 returned error can't find the container with id 1e90d1e7216efbf026add455c8fbd1744329a48d1c34b216da8bf7cd09879a54 Sep 29 09:44:35 crc 
kubenswrapper[4922]: I0929 09:44:35.931265 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.932567 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-147fdb03aa514a7eaaa6a14a7f0d991fab8b66566b7fa096e72a32431d3bcb9e WatchSource:0}: Error finding container 147fdb03aa514a7eaaa6a14a7f0d991fab8b66566b7fa096e72a32431d3bcb9e: Status 404 returned error can't find the container with id 147fdb03aa514a7eaaa6a14a7f0d991fab8b66566b7fa096e72a32431d3bcb9e Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.936566 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-52c185eac63b61e0dd757158b20f9fc47223555c26e72f22f63be464929e7c70 WatchSource:0}: Error finding container 52c185eac63b61e0dd757158b20f9fc47223555c26e72f22f63be464929e7c70: Status 404 returned error can't find the container with id 52c185eac63b61e0dd757158b20f9fc47223555c26e72f22f63be464929e7c70 Sep 29 09:44:35 crc kubenswrapper[4922]: W0929 09:44:35.947524 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2357b8e4982dec1cbee4412ad164771e1db50b23b1a5beffa61d7c75ea792ee7 WatchSource:0}: Error finding container 2357b8e4982dec1cbee4412ad164771e1db50b23b1a5beffa61d7c75ea792ee7: Status 404 returned error can't find the container with id 2357b8e4982dec1cbee4412ad164771e1db50b23b1a5beffa61d7c75ea792ee7 Sep 29 09:44:36 crc kubenswrapper[4922]: E0929 09:44:36.016741 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.111:6443: connect: connection refused" interval="800ms" Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.225930 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.227602 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.227649 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.227673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.227701 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:44:36 crc kubenswrapper[4922]: E0929 09:44:36.228255 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.111:6443: connect: connection refused" node="crc" Sep 29 09:44:36 crc kubenswrapper[4922]: W0929 09:44:36.290177 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:36 crc kubenswrapper[4922]: E0929 09:44:36.290308 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.111:6443: connect: connection refused" 
logger="UnhandledError" Sep 29 09:44:36 crc kubenswrapper[4922]: W0929 09:44:36.314878 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:36 crc kubenswrapper[4922]: E0929 09:44:36.315022 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.111:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.396063 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.457474 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"147fdb03aa514a7eaaa6a14a7f0d991fab8b66566b7fa096e72a32431d3bcb9e"} Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.459634 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1e90d1e7216efbf026add455c8fbd1744329a48d1c34b216da8bf7cd09879a54"} Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.461784 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"def34a5148b40bdf92ff4089d74d44821ad9e054b4359a026f712a58375abd1f"} Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.463553 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2357b8e4982dec1cbee4412ad164771e1db50b23b1a5beffa61d7c75ea792ee7"} Sep 29 09:44:36 crc kubenswrapper[4922]: I0929 09:44:36.464769 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"52c185eac63b61e0dd757158b20f9fc47223555c26e72f22f63be464929e7c70"} Sep 29 09:44:36 crc kubenswrapper[4922]: W0929 09:44:36.602783 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:36 crc kubenswrapper[4922]: E0929 09:44:36.602891 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.111:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:44:36 crc kubenswrapper[4922]: E0929 09:44:36.818197 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.111:6443: connect: connection refused" interval="1.6s" Sep 29 09:44:36 crc kubenswrapper[4922]: W0929 09:44:36.831569 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:36 crc kubenswrapper[4922]: E0929 09:44:36.831680 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.111:6443: connect: connection refused" logger="UnhandledError" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.028504 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.037022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.037157 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.037178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.037266 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:44:37 crc kubenswrapper[4922]: E0929 09:44:37.038237 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.111:6443: connect: connection refused" node="crc" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.395714 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:37 crc 
kubenswrapper[4922]: I0929 09:44:37.470958 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1"} Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.471316 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990"} Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.471332 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04"} Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.471355 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3"} Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.471083 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.472497 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.472528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.472539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.474286 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb" exitCode=0 Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.474367 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb"} Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.474530 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.475864 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.475928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.475953 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.476876 4922 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91" exitCode=0 Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.477109 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.477128 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91"} Sep 29 09:44:37 crc kubenswrapper[4922]: 
I0929 09:44:37.478742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.478761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.478772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.478773 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.479761 4922 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7dab15c98b18ceef390b3befae924ef660b642c4443a0a33053c4749ea39f778" exitCode=0 Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.479825 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7dab15c98b18ceef390b3befae924ef660b642c4443a0a33053c4749ea39f778"} Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.479868 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.480796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.480945 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.480979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.481208 4922 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.481239 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.481250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.482945 4922 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7" exitCode=0 Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.482989 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7"} Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.483074 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.483969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.484021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:37 crc kubenswrapper[4922]: I0929 09:44:37.484092 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.395986 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused Sep 29 09:44:38 
crc kubenswrapper[4922]: E0929 09:44:38.419930 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.111:6443: connect: connection refused" interval="3.2s" Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.491069 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.491053 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512"} Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.491240 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9"} Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.491264 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846"} Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.491945 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.491976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.491989 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:44:38 crc 
kubenswrapper[4922]: I0929 09:44:38.501325 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562"}
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.501361 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3"}
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.501377 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265"}
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.501389 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157"}
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.503299 4922 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb" exitCode=0
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.503354 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb"}
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.503487 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.504550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.504580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.504591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.507265 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a4360e9c68d925f807b7768fd6fc1eccc5b7ee3ab8664cb3783ad84d9b01e55e"}
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.507306 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.507294 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.509026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.509052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.509064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.509305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.510244 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.510323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:38 crc kubenswrapper[4922]: W0929 09:44:38.581755 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.111:6443: connect: connection refused
Sep 29 09:44:38 crc kubenswrapper[4922]: E0929 09:44:38.581887 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.111:6443: connect: connection refused" logger="UnhandledError"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.638852 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.640282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.640344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.640358 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:38 crc kubenswrapper[4922]: I0929 09:44:38.640399 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 29 09:44:38 crc kubenswrapper[4922]: E0929 09:44:38.641204 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.111:6443: connect: connection refused" node="crc"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.095660 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.516133 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845"}
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.516271 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.523407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.523462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.523489 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.523516 4922 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616" exitCode=0
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.523698 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.523982 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.524023 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.523991 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.524205 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.524799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.524852 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.524866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.524796 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616"}
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.525973 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.526000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.526014 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.526208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.526286 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.526307 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.527435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.527594 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:39 crc kubenswrapper[4922]: I0929 09:44:39.528187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.476539 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.485958 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.531324 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9"}
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.531377 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743"}
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.531398 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5"}
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.531420 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.531464 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.531534 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.533734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.533777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.533796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.534678 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.534733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:40 crc kubenswrapper[4922]: I0929 09:44:40.534768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.541617 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6"}
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.541688 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.541690 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2"}
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.541636 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.542549 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.543412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.543421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.543464 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.543484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.543443 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.543551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.568564 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.568757 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.570731 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.570781 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.570797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.815762 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.816109 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.818433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.818496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.818519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.841753 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.844363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.844436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.844462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:41 crc kubenswrapper[4922]: I0929 09:44:41.844504 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.133375 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.223641 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.539528 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.543931 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.544053 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.544574 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.545158 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.545203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.545219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.545689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.545756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.545777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.545997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.546030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:42 crc kubenswrapper[4922]: I0929 09:44:42.546054 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:43 crc kubenswrapper[4922]: I0929 09:44:43.411090 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Sep 29 09:44:43 crc kubenswrapper[4922]: I0929 09:44:43.546680 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:43 crc kubenswrapper[4922]: I0929 09:44:43.546756 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:43 crc kubenswrapper[4922]: I0929 09:44:43.548108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:43 crc kubenswrapper[4922]: I0929 09:44:43.548313 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:43 crc kubenswrapper[4922]: I0929 09:44:43.548455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:43 crc kubenswrapper[4922]: I0929 09:44:43.548322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:43 crc kubenswrapper[4922]: I0929 09:44:43.548688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:43 crc kubenswrapper[4922]: I0929 09:44:43.548717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:45 crc kubenswrapper[4922]: E0929 09:44:45.524878 4922 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.088619 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.088903 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.090676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.090728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.090739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.661314 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.661632 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.663299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.663346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.663354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:46 crc kubenswrapper[4922]: I0929 09:44:46.669573 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 09:44:47 crc kubenswrapper[4922]: I0929 09:44:47.558021 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:47 crc kubenswrapper[4922]: I0929 09:44:47.559352 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:47 crc kubenswrapper[4922]: I0929 09:44:47.559401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:47 crc kubenswrapper[4922]: I0929 09:44:47.559422 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:49 crc kubenswrapper[4922]: W0929 09:44:49.115825 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Sep 29 09:44:49 crc kubenswrapper[4922]: I0929 09:44:49.115956 4922 trace.go:236] Trace[664371471]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 09:44:39.114) (total time: 10000ms):
Sep 29 09:44:49 crc kubenswrapper[4922]: Trace[664371471]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (09:44:49.115)
Sep 29 09:44:49 crc kubenswrapper[4922]: Trace[664371471]: [10.00096947s] [10.00096947s] END
Sep 29 09:44:49 crc kubenswrapper[4922]: E0929 09:44:49.115982 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Sep 29 09:44:49 crc kubenswrapper[4922]: W0929 09:44:49.396065 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Sep 29 09:44:49 crc kubenswrapper[4922]: I0929 09:44:49.396242 4922 trace.go:236] Trace[1861099900]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 09:44:39.395) (total time: 10001ms):
Sep 29 09:44:49 crc kubenswrapper[4922]: Trace[1861099900]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:44:49.396)
Sep 29 09:44:49 crc kubenswrapper[4922]: Trace[1861099900]: [10.001182606s] [10.001182606s] END
Sep 29 09:44:49 crc kubenswrapper[4922]: I0929 09:44:49.396110 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Sep 29 09:44:49 crc kubenswrapper[4922]: E0929 09:44:49.396284 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Sep 29 09:44:49 crc kubenswrapper[4922]: W0929 09:44:49.596019 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Sep 29 09:44:49 crc kubenswrapper[4922]: I0929 09:44:49.596211 4922 trace.go:236] Trace[1275614866]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 09:44:39.593) (total time: 10002ms):
Sep 29 09:44:49 crc kubenswrapper[4922]: Trace[1275614866]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (09:44:49.595)
Sep 29 09:44:49 crc kubenswrapper[4922]: Trace[1275614866]: [10.002227883s] [10.002227883s] END
Sep 29 09:44:49 crc kubenswrapper[4922]: E0929 09:44:49.596266 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Sep 29 09:44:49 crc kubenswrapper[4922]: I0929 09:44:49.661324 4922 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 29 09:44:49 crc kubenswrapper[4922]: I0929 09:44:49.661477 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 29 09:44:49 crc kubenswrapper[4922]: E0929 09:44:49.677897 4922 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.1869b7ab3357db54 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-29 09:44:35.393575764 +0000 UTC m=+0.759806028,LastTimestamp:2025-09-29 09:44:35.393575764 +0000 UTC m=+0.759806028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 29 09:44:50 crc kubenswrapper[4922]: I0929 09:44:50.501341 4922 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Sep 29 09:44:50 crc kubenswrapper[4922]: I0929 09:44:50.501423 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Sep 29 09:44:50 crc kubenswrapper[4922]: I0929 09:44:50.510018 4922 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Sep 29 09:44:50 crc kubenswrapper[4922]: I0929 09:44:50.510065 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.546196 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.546498 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.547820 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.547887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.547901 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.550449 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.570424 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.570489 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.571719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.571797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:44:52 crc kubenswrapper[4922]: I0929 09:44:52.571812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.354817 4922 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.387242 4922 apiserver.go:52] "Watching apiserver"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.397033 4922 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.397482 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.398804 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.399057 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.399220 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.399390 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 29 09:44:53 crc kubenswrapper[4922]: E0929 09:44:53.399392 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.399434 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.399518 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 29 09:44:53 crc kubenswrapper[4922]: E0929 09:44:53.399534 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 09:44:53 crc kubenswrapper[4922]: E0929 09:44:53.399902 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.405891 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.406099 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.406335 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.406478 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.408655 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.408900 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.409043 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.409361 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.419039 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.479714 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Sep 29 09:44:53 crc
kubenswrapper[4922]: I0929 09:44:53.494299 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.495344 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.497478 4922 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.498981 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.512510 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.522751 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.532157 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.543472 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.555616 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.565634 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:53 crc kubenswrapper[4922]: I0929 09:44:53.690013 4922 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.451670 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.451966 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.452163 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.452302 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.452371 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.452540 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.467110 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.476679 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.492236 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.501713 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.503919 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.506275 4922 trace.go:236] Trace[348956305]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 09:44:42.979) (total time: 12527ms): Sep 29 09:44:55 crc kubenswrapper[4922]: Trace[348956305]: ---"Objects listed" error: 12526ms (09:44:55.506) Sep 29 09:44:55 crc kubenswrapper[4922]: Trace[348956305]: [12.527021334s] [12.527021334s] END Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.506494 4922 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.509061 4922 reconstruct.go:205] "DevicePaths 
of reconstructed volumes updated" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.510148 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.526021 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/mani
fests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3
e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.556527 4922 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41958->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.556588 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41958->192.168.126.11:17697: read: connection reset by peer" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.556646 4922 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41974->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.556792 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41974->192.168.126.11:17697: read: connection reset by peer" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.557136 4922 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.557401 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.557959 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.576807 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.609467 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.609704 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.609779 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.609861 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.609937 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610003 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610068 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610131 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610192 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610255 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610319 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610384 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610453 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:44:55 crc 
kubenswrapper[4922]: I0929 09:44:55.610523 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610587 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610652 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610721 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610788 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610867 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611457 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611504 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611530 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611554 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611578 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 09:44:55 crc kubenswrapper[4922]: 
I0929 09:44:55.611602 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611620 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611643 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.610776 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613750 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611405 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611431 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613798 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.611839 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612011 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613843 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612237 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612255 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612329 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613877 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612474 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612605 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612722 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613910 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613935 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613957 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613982 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614005 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614039 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614067 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614101 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614127 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614155 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614186 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614220 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614247 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614283 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614316 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614348 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614376 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614417 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614447 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614480 
4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614505 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614545 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614579 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614603 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614635 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614671 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614703 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614733 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614763 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614793 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614821 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614873 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612760 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612823 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614901 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612914 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612936 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.612993 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614925 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614953 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614982 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615009 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615032 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615058 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615084 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615110 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615142 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615167 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615194 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 
09:44:55.615219 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615247 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615276 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615301 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615330 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615358 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615384 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615485 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615515 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615615 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615645 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615676 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615709 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615735 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615768 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615797 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615820 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615871 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615902 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615949 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615979 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616008 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616052 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616080 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616109 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616140 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616167 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616215 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616244 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616270 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616421 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616473 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616505 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616535 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" 
(UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616566 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616809 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.631284 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.631345 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613133 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613160 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613168 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613034 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613306 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.613725 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614090 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614161 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614366 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614551 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.614880 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615140 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615480 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615732 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.615935 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616118 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616306 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632066 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616538 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616676 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.616709 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.617186 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.617349 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.617385 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.618083 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.618445 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.618690 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.618916 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.619047 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.619261 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.619508 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.619712 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.621042 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.621385 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.622092 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.623421 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.623671 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.623978 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.624239 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.624885 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.625157 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.625595 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.625638 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626072 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626190 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626277 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626408 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626496 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626521 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626542 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626645 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626739 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.626975 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.627185 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.627667 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.628609 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.629690 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.629798 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.630029 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.630062 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.630265 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.630847 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.630871 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.631202 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.631310 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.631421 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.631445 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.631439 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.631812 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.631819 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632152 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632589 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632623 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632645 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632673 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632698 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632726 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632747 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632769 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632789 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632812 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632889 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632913 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632935 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.632983 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 09:44:55 crc 
kubenswrapper[4922]: I0929 09:44:55.633005 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633030 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633055 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633081 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633102 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633125 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633152 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633172 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633190 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633213 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633233 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 09:44:55 crc kubenswrapper[4922]: 
I0929 09:44:55.633254 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633272 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633292 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633316 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633338 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633362 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633387 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633410 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633431 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633452 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633476 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633496 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633520 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.633542 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.636242 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.636781 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.637162 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.637368 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.637385 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.637519 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.637687 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.637965 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.638563 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639297 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639478 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639415 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639568 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639600 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639623 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: 
I0929 09:44:55.639647 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639673 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639700 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639719 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639743 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639766 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") 
pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639786 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639807 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639849 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639875 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639919 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639955 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639989 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640020 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640068 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640100 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640125 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640157 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640292 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640325 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640359 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640392 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: 
\"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640788 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640817 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640865 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640891 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640919 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640948 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640972 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640999 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641028 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641055 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641077 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641102 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641130 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641153 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641181 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641212 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641240 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641631 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641745 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641777 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641811 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641873 
4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641907 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641933 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641970 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642001 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642030 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642059 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642085 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642117 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642146 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642502 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642518 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642528 4922 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642538 4922 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642818 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642853 4922 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642866 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642896 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642906 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642916 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642926 4922 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642938 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642949 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643172 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath 
\"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643183 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643197 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643206 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643216 4922 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643226 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643238 4922 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643248 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643258 4922 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643270 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643280 4922 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643290 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643300 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643313 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643324 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643334 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643569 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643583 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643593 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643604 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643613 4922 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643625 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643635 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 
crc kubenswrapper[4922]: I0929 09:44:55.643644 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639513 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639644 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639848 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.644249 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.644262 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.644369 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.639894 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640244 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.640345 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641347 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641356 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641505 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641791 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641894 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641952 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642018 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642238 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.641656 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642394 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642620 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642653 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642708 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.642999 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643270 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643390 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643506 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643733 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.643984 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.644014 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.644602 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.644904 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.644972 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.644994 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.645013 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.645082 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.645448 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.645500 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.645511 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.645519 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.645551 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.645824 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.645903 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.646486 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.646554 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.646503 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.646714 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.646909 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.647144 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.647254 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-29 09:44:56.147231724 +0000 UTC m=+21.513461988 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.647253 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.647647 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.647977 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.648126 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.648851 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.649103 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.649121 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.649640 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650453 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650513 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650756 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650804 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650819 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650855 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650869 4922 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650882 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650894 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650909 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650924 4922 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650938 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc 
kubenswrapper[4922]: I0929 09:44:55.650949 4922 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650963 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.650974 4922 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651040 4922 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651084 4922 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651305 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651396 4922 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651432 4922 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651512 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651556 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651571 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651673 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651693 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 
09:44:55.651708 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.651810 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.651935 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:56.151914041 +0000 UTC m=+21.518144305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.651747 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.652402 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.652886 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.652954 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.653008 4922 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.653026 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.653043 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.653054 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.653084 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.653096 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.653110 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.653121 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.653130 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.666397 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.667265 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.667283 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.667487 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.667595 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.667665 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.668072 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.668264 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.668358 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.668387 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.668420 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.668607 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.668422 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669624 4922 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669647 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669668 4922 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669690 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669705 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669721 4922 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath 
\"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669736 4922 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669730 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669750 4922 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669851 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669864 4922 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669875 4922 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669889 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669902 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669903 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.669916 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670039 4922 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670055 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670068 4922 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670083 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670100 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670114 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670129 4922 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670143 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670156 4922 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670169 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670270 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670289 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670283 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670303 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.670919 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.671275 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.671698 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.671819 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.671965 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.672035 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.672128 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.672293 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:56.172264015 +0000 UTC m=+21.538494499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.672551 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.671976 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.672158 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.671976 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.672649 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.672717 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.672810 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:56.172780919 +0000 UTC m=+21.539011183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.673019 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.673391 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: E0929 09:44:55.673484 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:44:56.173475458 +0000 UTC m=+21.539705722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.673703 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.673993 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.674200 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.674692 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.674773 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.674949 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.675218 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.685774 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.686878 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.692717 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.692854 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.693173 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.694658 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.694855 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.695225 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.696069 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.712806 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.716798 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.717423 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.726926 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771149 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771216 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771252 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771312 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 
09:44:55.771328 4922 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771340 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771352 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771363 4922 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771373 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771384 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771396 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771407 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771418 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771429 4922 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771440 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771450 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771461 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771472 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771482 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" 
DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771492 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771502 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771513 4922 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771424 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771525 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771587 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771600 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node 
\"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771611 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771621 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771630 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771639 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771648 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771658 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771667 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771677 4922 
reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771685 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771694 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771706 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771724 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771732 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771742 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771751 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771760 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771769 4922 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771779 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771788 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771842 4922 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771866 4922 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771874 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771883 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771892 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771901 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771909 4922 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771920 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771929 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771939 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771947 4922 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771956 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771966 4922 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771975 4922 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771984 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.771994 4922 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772002 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on 
node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772011 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772021 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772030 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772039 4922 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772047 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772056 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772065 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772074 4922 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772084 4922 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772094 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772103 4922 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772112 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772121 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772130 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772139 4922 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772147 4922 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772157 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772166 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772174 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772183 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772192 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772201 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772212 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772220 4922 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772229 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772241 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772250 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772259 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772273 4922 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772281 4922 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772290 4922 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772298 4922 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772306 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772315 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772323 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772331 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772341 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.772349 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.783713 4922 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.835432 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.847786 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 09:44:55 crc kubenswrapper[4922]: W0929 09:44:55.848236 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-753cabdc649549651ebfeae5d31991f191826ab515c540ca56ccae97fdac2cca WatchSource:0}: Error finding container 753cabdc649549651ebfeae5d31991f191826ab515c540ca56ccae97fdac2cca: Status 404 returned error can't find the container with id 753cabdc649549651ebfeae5d31991f191826ab515c540ca56ccae97fdac2cca Sep 29 09:44:55 crc kubenswrapper[4922]: I0929 09:44:55.862891 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.178803 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:44:57.178747588 +0000 UTC m=+22.544977852 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.178910 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.179049 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.179141 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.179412 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.179533 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:57.179522369 +0000 UTC m=+22.545752633 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.179644 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.179688 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.179706 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.179766 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:57.179735065 +0000 UTC m=+22.545965329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.179799 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.179870 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.180016 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.180032 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.180043 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.180110 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:57.180098185 +0000 UTC m=+22.546328449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.180188 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:44:56 crc kubenswrapper[4922]: E0929 09:44:56.180216 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-29 09:44:57.180208988 +0000 UTC m=+22.546439252 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.581919 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"046b531388b19bcea0beb9c463b3d8f7e2bf580504d000fd4ef405052a8d2cbd"} Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.585688 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb"} Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.585772 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5"} Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.585791 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5579e899fe1bc430f37f4ebccf29eea7e004f14ec26043607976e2e6f570c5a0"} Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.590368 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a"} Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.590432 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"753cabdc649549651ebfeae5d31991f191826ab515c540ca56ccae97fdac2cca"} Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.593717 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.597626 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845" exitCode=255 Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.597700 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845"} Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.599206 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.615320 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.627613 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.638259 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.655916 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.665714 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.669500 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.682185 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.699768 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.713402 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.726355 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.739643 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.752892 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.767002 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.781225 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.781611 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.781391 4922 scope.go:117] "RemoveContainer" containerID="e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.794414 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.829995 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.991682 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-h6dfk"] Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.992138 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h6dfk" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.994290 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.994613 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.994751 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.994915 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.995112 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.998145 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dbcdn"] Sep 29 09:44:56 
crc kubenswrapper[4922]: I0929 09:44:56.998533 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dbcdn" Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.999666 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kgzgq"] Sep 29 09:44:56 crc kubenswrapper[4922]: I0929 09:44:56.999943 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.005468 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.005557 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.005719 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.005917 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.009369 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.009782 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.011351 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.011715 4922 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.014497 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tr9bt"] Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.015657 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.016379 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-66xg9"] Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.017099 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.018662 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.023462 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.025455 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.025597 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.025719 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.026024 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.026098 4922 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.026140 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.026170 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.039243 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.068213 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.085108 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089461 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-slash\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089523 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-etc-openvswitch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089552 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-daemon-config\") 
pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089574 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-systemd-units\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089595 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-node-log\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089636 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-var-lib-cni-bin\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089659 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-var-lib-openvswitch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089679 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089713 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-var-lib-kubelet\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089779 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-conf-dir\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.089929 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-cni-dir\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090027 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-openvswitch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090058 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/18583652-9871-4fba-93c8-9f86e9f57622-rootfs\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090083 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7p2n\" (UniqueName: \"kubernetes.io/projected/18583652-9871-4fba-93c8-9f86e9f57622-kube-api-access-m7p2n\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090150 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c029143-44a6-410b-8496-24f92c58bb8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090174 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-systemd\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090205 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-log-socket\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090232 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-kubelet\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8rch\" (UniqueName: \"kubernetes.io/projected/ee08d9f2-f100-4598-8ab3-5198a21b08f0-kube-api-access-p8rch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090471 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18583652-9871-4fba-93c8-9f86e9f57622-proxy-tls\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090513 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-config\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090540 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-system-cni-dir\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc 
kubenswrapper[4922]: I0929 09:44:57.090568 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-cnibin\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090615 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-netd\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090647 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090681 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-run-netns\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090698 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-var-lib-cni-multus\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " 
pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090719 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-ovn\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-etc-kubernetes\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090754 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-os-release\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090772 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7dc69012-4e4c-437b-82d8-9d04e2e22e58-cni-binary-copy\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090804 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-socket-dir-parent\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " 
pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090820 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a512521c-5cca-4e12-8e5f-97ba1b42b325-hosts-file\") pod \"node-resolver-dbcdn\" (UID: \"a512521c-5cca-4e12-8e5f-97ba1b42b325\") " pod="openshift-dns/node-resolver-dbcdn" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090858 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-bin\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090873 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090891 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-system-cni-dir\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090913 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-run-multus-certs\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " 
pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090933 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-hostroot\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090950 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hstnh\" (UniqueName: \"kubernetes.io/projected/7dc69012-4e4c-437b-82d8-9d04e2e22e58-kube-api-access-hstnh\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.090983 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-netns\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091029 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18583652-9871-4fba-93c8-9f86e9f57622-mcd-auth-proxy-config\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091083 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c029143-44a6-410b-8496-24f92c58bb8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-66xg9\" (UID: 
\"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091121 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-864fc\" (UniqueName: \"kubernetes.io/projected/a512521c-5cca-4e12-8e5f-97ba1b42b325-kube-api-access-864fc\") pod \"node-resolver-dbcdn\" (UID: \"a512521c-5cca-4e12-8e5f-97ba1b42b325\") " pod="openshift-dns/node-resolver-dbcdn" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091152 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovn-node-metrics-cert\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091174 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-script-lib\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091210 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gcb\" (UniqueName: \"kubernetes.io/projected/8c029143-44a6-410b-8496-24f92c58bb8f-kube-api-access-v2gcb\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091235 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-cnibin\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091254 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-os-release\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091280 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-run-k8s-cni-cncf-io\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.091306 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-env-overrides\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.120599 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.146795 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.162071 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.181242 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192277 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192391 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-etc-kubernetes\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192425 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-os-release\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.192521 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:44:59.192473426 +0000 UTC m=+24.558703920 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192556 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-etc-kubernetes\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192624 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7dc69012-4e4c-437b-82d8-9d04e2e22e58-cni-binary-copy\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192671 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-socket-dir-parent\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192709 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/a512521c-5cca-4e12-8e5f-97ba1b42b325-hosts-file\") pod \"node-resolver-dbcdn\" (UID: \"a512521c-5cca-4e12-8e5f-97ba1b42b325\") " pod="openshift-dns/node-resolver-dbcdn" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192747 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-bin\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192776 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-system-cni-dir\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192804 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-run-multus-certs\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192842 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-socket-dir-parent\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192854 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192906 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c029143-44a6-410b-8496-24f92c58bb8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192910 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-system-cni-dir\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192916 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a512521c-5cca-4e12-8e5f-97ba1b42b325-hosts-file\") pod \"node-resolver-dbcdn\" (UID: \"a512521c-5cca-4e12-8e5f-97ba1b42b325\") " pod="openshift-dns/node-resolver-dbcdn" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192946 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-bin\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193162 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-hostroot\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " 
pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193169 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-run-multus-certs\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.192938 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-hostroot\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193237 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hstnh\" (UniqueName: \"kubernetes.io/projected/7dc69012-4e4c-437b-82d8-9d04e2e22e58-kube-api-access-hstnh\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193272 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-netns\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193303 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18583652-9871-4fba-93c8-9f86e9f57622-mcd-auth-proxy-config\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 
09:44:57.193336 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gcb\" (UniqueName: \"kubernetes.io/projected/8c029143-44a6-410b-8496-24f92c58bb8f-kube-api-access-v2gcb\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193360 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-864fc\" (UniqueName: \"kubernetes.io/projected/a512521c-5cca-4e12-8e5f-97ba1b42b325-kube-api-access-864fc\") pod \"node-resolver-dbcdn\" (UID: \"a512521c-5cca-4e12-8e5f-97ba1b42b325\") " pod="openshift-dns/node-resolver-dbcdn" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193390 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovn-node-metrics-cert\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193419 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-script-lib\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193449 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-os-release\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193477 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-run-k8s-cni-cncf-io\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193507 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-env-overrides\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193532 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-cnibin\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193560 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-slash\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193588 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-etc-openvswitch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193622 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193650 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-var-lib-cni-bin\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193681 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-daemon-config\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193711 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-systemd-units\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193738 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-node-log\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193770 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193800 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-var-lib-kubelet\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193846 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-conf-dir\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193874 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c029143-44a6-410b-8496-24f92c58bb8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193911 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7dc69012-4e4c-437b-82d8-9d04e2e22e58-cni-binary-copy\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193919 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-var-lib-openvswitch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193952 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-cnibin\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193878 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-var-lib-openvswitch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.193985 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-netns\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194004 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194021 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194032 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-cni-dir\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194062 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-openvswitch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194092 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c029143-44a6-410b-8496-24f92c58bb8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194098 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194117 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-systemd\") pod \"ovnkube-node-tr9bt\" (UID: 
\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194141 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-log-socket\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194146 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-openvswitch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194168 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/18583652-9871-4fba-93c8-9f86e9f57622-rootfs\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194193 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-slash\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194400 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-etc-openvswitch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 
09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194436 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-cni-dir\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194492 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18583652-9871-4fba-93c8-9f86e9f57622-mcd-auth-proxy-config\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.194500 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.194524 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.194540 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194557 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-os-release\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc 
kubenswrapper[4922]: E0929 09:44:57.194590 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:59.194578342 +0000 UTC m=+24.560808806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194594 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-run-k8s-cni-cncf-io\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194640 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-var-lib-cni-bin\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194853 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c029143-44a6-410b-8496-24f92c58bb8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194909 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-systemd\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194944 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-log-socket\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194978 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/18583652-9871-4fba-93c8-9f86e9f57622-rootfs\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195017 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-systemd-units\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195095 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-node-log\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195192 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-env-overrides\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.195207 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195220 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-conf-dir\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.194194 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7p2n\" (UniqueName: \"kubernetes.io/projected/18583652-9871-4fba-93c8-9f86e9f57622-kube-api-access-m7p2n\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195243 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-var-lib-kubelet\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.195265 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:59.195244841 +0000 UTC m=+24.561475105 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195292 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.195322 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.195377 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:59.195363154 +0000 UTC m=+24.561593638 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.195384 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.195398 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.195409 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195415 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-os-release\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195328 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:44:57 
crc kubenswrapper[4922]: E0929 09:44:57.195436 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:44:59.195428516 +0000 UTC m=+24.561658780 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195466 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-kubelet\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195487 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8rch\" (UniqueName: \"kubernetes.io/projected/ee08d9f2-f100-4598-8ab3-5198a21b08f0-kube-api-access-p8rch\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195508 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18583652-9871-4fba-93c8-9f86e9f57622-proxy-tls\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 
09:44:57.195512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-kubelet\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-script-lib\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195531 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-cnibin\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195559 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-netd\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195513 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7dc69012-4e4c-437b-82d8-9d04e2e22e58-multus-daemon-config\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195589 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195607 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-cnibin\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195618 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195636 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-netd\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195635 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-config\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195688 4922 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-system-cni-dir\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195714 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-run-netns\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195741 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-var-lib-cni-multus\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195766 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-ovn\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195766 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c029143-44a6-410b-8496-24f92c58bb8f-system-cni-dir\") pod \"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195815 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-var-lib-cni-multus\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195852 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-ovn\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.195792 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7dc69012-4e4c-437b-82d8-9d04e2e22e58-host-run-netns\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.196188 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-config\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.196965 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.204550 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18583652-9871-4fba-93c8-9f86e9f57622-proxy-tls\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.205050 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovn-node-metrics-cert\") pod \"ovnkube-node-tr9bt\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.213766 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gcb\" (UniqueName: \"kubernetes.io/projected/8c029143-44a6-410b-8496-24f92c58bb8f-kube-api-access-v2gcb\") pod 
\"multus-additional-cni-plugins-66xg9\" (UID: \"8c029143-44a6-410b-8496-24f92c58bb8f\") " pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.217896 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7p2n\" (UniqueName: \"kubernetes.io/projected/18583652-9871-4fba-93c8-9f86e9f57622-kube-api-access-m7p2n\") pod \"machine-config-daemon-kgzgq\" (UID: \"18583652-9871-4fba-93c8-9f86e9f57622\") " pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.218873 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.223337 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hstnh\" (UniqueName: \"kubernetes.io/projected/7dc69012-4e4c-437b-82d8-9d04e2e22e58-kube-api-access-hstnh\") pod \"multus-h6dfk\" (UID: \"7dc69012-4e4c-437b-82d8-9d04e2e22e58\") " pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.223621 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-864fc\" (UniqueName: \"kubernetes.io/projected/a512521c-5cca-4e12-8e5f-97ba1b42b325-kube-api-access-864fc\") pod \"node-resolver-dbcdn\" (UID: \"a512521c-5cca-4e12-8e5f-97ba1b42b325\") " pod="openshift-dns/node-resolver-dbcdn" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.223670 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8rch\" (UniqueName: \"kubernetes.io/projected/ee08d9f2-f100-4598-8ab3-5198a21b08f0-kube-api-access-p8rch\") pod \"ovnkube-node-tr9bt\" (UID: 
\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.234925 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.248061 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.258629 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.269004 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.280786 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.292222 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.303228 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.307275 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h6dfk" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.314565 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dbcdn" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.322696 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.330076 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.331447 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.340791 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-66xg9" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.341877 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.357907 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.368231 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.382224 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers 
with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.400782 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.424756 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T0
9:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-
copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.438656 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.453021 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.453066 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.453025 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.453166 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.453249 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.453346 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.458212 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.458949 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.460134 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.460847 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.462163 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.463001 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.463661 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.470671 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.472041 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.472567 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.473151 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.474331 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.475125 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.476097 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.476614 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.477276 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.479802 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.480243 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.481068 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.484787 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.485795 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.487519 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.488217 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.489876 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.490859 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.491873 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.493679 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.496069 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.497714 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.500074 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.501157 4922 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.501389 4922 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.506670 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.508313 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.509356 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.513127 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.517284 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.518642 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.520299 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.522760 4922 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.524111 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.526324 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.527877 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.528675 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.529864 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.530579 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.531696 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.532663 4922 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.533817 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.534522 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.535178 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.536571 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.537428 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.538742 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.603523 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" 
event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerStarted","Data":"c8b869069cb048a8eba74c5f1bb4be9f36659abeb24bdc9f8b4239118b926b05"} Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.604622 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2"} Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.604669 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"1143daf9114446710b81ad0d943832ac3988eb2bae8123c048a39d200a1c344f"} Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.605538 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6dfk" event={"ID":"7dc69012-4e4c-437b-82d8-9d04e2e22e58","Type":"ContainerStarted","Data":"571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc"} Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.605569 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6dfk" event={"ID":"7dc69012-4e4c-437b-82d8-9d04e2e22e58","Type":"ContainerStarted","Data":"8a601534bd4074aa203743f4ac042554c3e0cd25ca34e7cd093ba20672fd7a8c"} Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.609579 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.614312 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19"} 
Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.616283 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.619561 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4" exitCode=0 Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.619648 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4"} Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.619671 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"b7010633445d722b5310860b71b80808292b33cf43f584358d3709d3b1ce4e9f"} Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.624452 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dbcdn" event={"ID":"a512521c-5cca-4e12-8e5f-97ba1b42b325","Type":"ContainerStarted","Data":"c27d282879165a06e6bc282e716188a6fb2a5a2d23e36efce6c0204a7390fb76"} Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.628493 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: E0929 09:44:57.633582 4922 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.645259 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.662194 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.677586 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.701709 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.733603 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac
08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.749004 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.763703 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.776306 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.788446 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.806752 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.827397 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.844505 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.860198 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.891042 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.928567 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.967397 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:57 crc kubenswrapper[4922]: I0929 09:44:57.987507 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.018355 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.039332 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.057203 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.074685 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.087978 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.116266 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.139748 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.155839 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.171985 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.183688 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.631172 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.636685 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" 
event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.637066 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.637136 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.637157 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.637176 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.637195 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.638743 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dbcdn" 
event={"ID":"a512521c-5cca-4e12-8e5f-97ba1b42b325","Type":"ContainerStarted","Data":"cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.640490 4922 generic.go:334] "Generic (PLEG): container finished" podID="8c029143-44a6-410b-8496-24f92c58bb8f" containerID="d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935" exitCode=0 Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.640600 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerDied","Data":"d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.642654 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51"} Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.651415 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.675168 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.688541 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.717396 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.731396 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.746350 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.758593 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.776896 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.792055 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.809880 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.820916 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.835031 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.848148 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.861973 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.880562 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.897126 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.913015 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.924419 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.948625 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.967415 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.982850 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:58 crc kubenswrapper[4922]: I0929 09:44:58.997553 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.018392 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.032055 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.049151 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.062401 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.074260 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.091696 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.219060 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219266 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:45:03.219225071 +0000 UTC m=+28.585455345 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.219382 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.219435 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.219481 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.219518 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219684 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219688 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219733 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219754 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219798 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219799 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:03.219773756 +0000 UTC m=+28.586004060 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219693 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219889 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:03.219853938 +0000 UTC m=+28.586084192 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219903 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219923 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219945 4922 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:03.21991875 +0000 UTC m=+28.586149054 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.219982 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:03.219968121 +0000 UTC m=+28.586198425 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.451013 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.451214 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.451389 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.452052 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.452184 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:44:59 crc kubenswrapper[4922]: E0929 09:44:59.452237 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.650853 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerStarted","Data":"7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e"} Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.670644 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.694918 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.713687 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.727694 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.744636 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.765076 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.778091 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.796074 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.817244 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.832252 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.844461 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.855920 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.874374 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:44:59 crc kubenswrapper[4922]: I0929 09:44:59.888882 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:44:59Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.657456 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661"} Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.659327 4922 generic.go:334] "Generic (PLEG): container finished" podID="8c029143-44a6-410b-8496-24f92c58bb8f" containerID="7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e" exitCode=0 Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.659362 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerDied","Data":"7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e"} Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.678145 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kub
e-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.695498 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.715161 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.744559 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://615
98b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.763523 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.778573 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.792226 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.817444 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.837261 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.853108 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.865664 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.877313 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.893099 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:00 crc kubenswrapper[4922]: I0929 09:45:00.922400 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:00Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.451212 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.451287 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:01 crc kubenswrapper[4922]: E0929 09:45:01.451328 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:01 crc kubenswrapper[4922]: E0929 09:45:01.451513 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.451632 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:01 crc kubenswrapper[4922]: E0929 09:45:01.451722 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.666051 4922 generic.go:334] "Generic (PLEG): container finished" podID="8c029143-44a6-410b-8496-24f92c58bb8f" containerID="e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75" exitCode=0 Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.666097 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerDied","Data":"e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75"} Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.687691 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.704889 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.722212 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.735069 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.767347 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.782153 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.799616 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.816225 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.833995 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.852471 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 
09:45:01.869481 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.890170 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.903930 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.910748 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.915707 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.916173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.916224 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.916246 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.916390 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.924030 4922 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.924520 4922 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.925948 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.925992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.926012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.926033 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.926045 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:01Z","lastTransitionTime":"2025-09-29T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:01 crc kubenswrapper[4922]: E0929 09:45:01.939706 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.943401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.943438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.943448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.943464 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.943474 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:01Z","lastTransitionTime":"2025-09-29T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:01 crc kubenswrapper[4922]: E0929 09:45:01.957465 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.960934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.960965 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.960974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.960989 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.961001 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:01Z","lastTransitionTime":"2025-09-29T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:01 crc kubenswrapper[4922]: E0929 09:45:01.978632 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.985020 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.985085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.985108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.985136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:01 crc kubenswrapper[4922]: I0929 09:45:01.985154 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:01Z","lastTransitionTime":"2025-09-29T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:01 crc kubenswrapper[4922]: E0929 09:45:01.999424 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:01Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.005520 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.005567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.005584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.005606 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.005622 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: E0929 09:45:02.019508 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: E0929 09:45:02.019623 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.021093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.021137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.021150 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.021167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.021176 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.123330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.123375 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.123390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.123411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.123425 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.226305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.226359 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.226375 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.226398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.226415 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.329581 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.329654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.329677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.329707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.329726 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.432997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.433061 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.433079 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.433106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.433126 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.537368 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.537455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.537472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.537536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.537553 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.640882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.640929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.640950 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.640976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.640995 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.675085 4922 generic.go:334] "Generic (PLEG): container finished" podID="8c029143-44a6-410b-8496-24f92c58bb8f" containerID="8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea" exitCode=0 Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.675209 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerDied","Data":"8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.701945 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.722754 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.745696 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.745740 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.745752 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.745770 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.745782 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.746583 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.765625 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.782759 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.811906 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.841735 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.849360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.849391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.849400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.849415 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.849432 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.861332 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.873658 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.886374 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.900516 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.917684 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.934488 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.948225 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:02Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.951946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.951983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.951991 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.952004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:02 crc kubenswrapper[4922]: I0929 09:45:02.952014 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:02Z","lastTransitionTime":"2025-09-29T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.054849 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.054900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.054911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.054928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.054938 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.157419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.157456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.157469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.157484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.157494 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.262651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.262702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.262711 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.262730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.262762 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.264876 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:45:11.264855149 +0000 UTC m=+36.631085413 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.264875 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.265956 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.266007 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.266047 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.266091 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.266355 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.266399 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.266420 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.266488 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:11.266464463 +0000 UTC m=+36.632694757 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.266950 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.266953 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.267013 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.267032 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.267046 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.266996 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2025-09-29 09:45:11.266986606 +0000 UTC m=+36.633216870 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.267087 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:11.267076999 +0000 UTC m=+36.633307453 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.267102 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:11.267096589 +0000 UTC m=+36.633326853 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.365327 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.365376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.365388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.365407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.365420 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.451222 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.451325 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.451400 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.451485 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.451639 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:03 crc kubenswrapper[4922]: E0929 09:45:03.451733 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.470736 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.470785 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.470796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.470846 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.470862 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.574121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.574161 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.574173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.574192 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.574204 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.677035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.677089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.677104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.677125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.677140 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.684035 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.684388 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.690988 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerStarted","Data":"30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.711459 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.721206 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.728418 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.748263 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.765339 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.780121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.780163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.780173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 
29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.780191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.780202 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.791226 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.811650 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.827048 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.842131 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.857810 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.873303 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.877721 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-h9kvt"] Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.878262 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.880411 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.881520 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.881562 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.882649 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.883206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.883252 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.883268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.883294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.883310 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.893790 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.912706 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.930096 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.944823 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.963335 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.973758 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cghjp\" (UniqueName: \"kubernetes.io/projected/0358d9bd-7f9c-49c4-9690-ee1fee839c52-kube-api-access-cghjp\") pod \"node-ca-h9kvt\" (UID: \"0358d9bd-7f9c-49c4-9690-ee1fee839c52\") " pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.973823 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0358d9bd-7f9c-49c4-9690-ee1fee839c52-host\") pod \"node-ca-h9kvt\" (UID: \"0358d9bd-7f9c-49c4-9690-ee1fee839c52\") " pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.973899 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0358d9bd-7f9c-49c4-9690-ee1fee839c52-serviceca\") pod \"node-ca-h9kvt\" (UID: \"0358d9bd-7f9c-49c4-9690-ee1fee839c52\") " pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.977409 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:03Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.987920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.988599 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.988727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:03 crc 
kubenswrapper[4922]: I0929 09:45:03.988823 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:03 crc kubenswrapper[4922]: I0929 09:45:03.988939 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:03Z","lastTransitionTime":"2025-09-29T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.007702 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.031668 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.048657 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.063585 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.075329 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cghjp\" (UniqueName: \"kubernetes.io/projected/0358d9bd-7f9c-49c4-9690-ee1fee839c52-kube-api-access-cghjp\") pod \"node-ca-h9kvt\" (UID: \"0358d9bd-7f9c-49c4-9690-ee1fee839c52\") " pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.075580 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0358d9bd-7f9c-49c4-9690-ee1fee839c52-host\") pod \"node-ca-h9kvt\" (UID: \"0358d9bd-7f9c-49c4-9690-ee1fee839c52\") " pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.075689 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/0358d9bd-7f9c-49c4-9690-ee1fee839c52-serviceca\") pod \"node-ca-h9kvt\" (UID: \"0358d9bd-7f9c-49c4-9690-ee1fee839c52\") " pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.075700 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0358d9bd-7f9c-49c4-9690-ee1fee839c52-host\") pod \"node-ca-h9kvt\" (UID: \"0358d9bd-7f9c-49c4-9690-ee1fee839c52\") " pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.077359 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0358d9bd-7f9c-49c4-9690-ee1fee839c52-serviceca\") pod \"node-ca-h9kvt\" (UID: \"0358d9bd-7f9c-49c4-9690-ee1fee839c52\") " pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.079247 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.091871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.091937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.091956 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.091984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.092002 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:04Z","lastTransitionTime":"2025-09-29T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.096008 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.103373 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cghjp\" (UniqueName: \"kubernetes.io/projected/0358d9bd-7f9c-49c4-9690-ee1fee839c52-kube-api-access-cghjp\") pod \"node-ca-h9kvt\" (UID: \"0358d9bd-7f9c-49c4-9690-ee1fee839c52\") " pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.113971 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2g
cb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.132769 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.151418 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.164143 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.186299 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.194979 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h9kvt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.198787 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.198921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.199000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.199076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.199135 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:04Z","lastTransitionTime":"2025-09-29T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.203825 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: W0929 09:45:04.210720 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0358d9bd_7f9c_49c4_9690_ee1fee839c52.slice/crio-56f775b414d0754a352286bb95e542de7cec5f5573a8896d67ff726f42296127 WatchSource:0}: Error finding container 56f775b414d0754a352286bb95e542de7cec5f5573a8896d67ff726f42296127: Status 404 returned error can't find the container with id 56f775b414d0754a352286bb95e542de7cec5f5573a8896d67ff726f42296127 Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.223991 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.304290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.304357 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.304375 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.304402 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.304423 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:04Z","lastTransitionTime":"2025-09-29T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.407490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.407567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.407589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.407617 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.407642 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:04Z","lastTransitionTime":"2025-09-29T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.509892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.509924 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.509932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.509946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.509955 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:04Z","lastTransitionTime":"2025-09-29T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.612623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.612673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.612685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.612704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.612720 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:04Z","lastTransitionTime":"2025-09-29T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.695748 4922 generic.go:334] "Generic (PLEG): container finished" podID="8c029143-44a6-410b-8496-24f92c58bb8f" containerID="30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f" exitCode=0 Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.695819 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerDied","Data":"30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.697955 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h9kvt" event={"ID":"0358d9bd-7f9c-49c4-9690-ee1fee839c52","Type":"ContainerStarted","Data":"f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.698493 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h9kvt" event={"ID":"0358d9bd-7f9c-49c4-9690-ee1fee839c52","Type":"ContainerStarted","Data":"56f775b414d0754a352286bb95e542de7cec5f5573a8896d67ff726f42296127"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.698535 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.698559 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.713647 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.716615 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.716706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.716730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.716766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.716793 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:04Z","lastTransitionTime":"2025-09-29T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.794965 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.797500 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.812465 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af
1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-0
9-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.819021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.819071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.819084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.819102 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.819117 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:04Z","lastTransitionTime":"2025-09-29T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.827002 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.845193 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.862470 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.876888 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.887309 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.906215 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.919340 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.922039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.922073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.922084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.922099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.922111 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:04Z","lastTransitionTime":"2025-09-29T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.931307 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.956886 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:04 crc kubenswrapper[4922]: I0929 09:45:04.984884 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.001675 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.014241 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.024609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc 
kubenswrapper[4922]: I0929 09:45:05.024850 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.024944 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.025039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.024612 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20
c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.025131 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.038646 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.052496 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.061606 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.079144 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.101300 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.120711 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.128233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.128276 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.128289 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.128307 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.128319 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.138701 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.157605 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.168781 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.184144 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.196166 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.208264 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.221283 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.231324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.231402 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc 
kubenswrapper[4922]: I0929 09:45:05.231428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.231456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.231479 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.231738 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.334751 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.334817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.334863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.334888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.334907 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.438268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.438346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.438374 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.438408 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.438432 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.451582 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.451688 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:05 crc kubenswrapper[4922]: E0929 09:45:05.451801 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.451620 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:05 crc kubenswrapper[4922]: E0929 09:45:05.451987 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:05 crc kubenswrapper[4922]: E0929 09:45:05.452115 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.477356 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.494731 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.511275 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.526496 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.539275 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.540882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.540917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.540927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.540942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.540953 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.551809 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.562162 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.578006 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.590179 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.603124 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.620904 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.634364 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.643173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.643208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.643217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.643231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.643240 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.654518 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.674312 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.690180 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.711514 4922 generic.go:334] "Generic (PLEG): container finished" podID="8c029143-44a6-410b-8496-24f92c58bb8f" containerID="1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd" exitCode=0 Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.711583 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerDied","Data":"1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.743120 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.752074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.752127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.752141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.752163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.752182 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.771072 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a
8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.785449 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.796552 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.811949 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.825081 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.835653 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.850654 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.854588 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.854610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.854619 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.854632 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.854642 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.865280 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.878175 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.898346 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.911459 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.925503 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.956147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:05 crc 
kubenswrapper[4922]: I0929 09:45:05.956182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.956193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.956210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:05 crc kubenswrapper[4922]: I0929 09:45:05.956224 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:05Z","lastTransitionTime":"2025-09-29T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.016386 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.051736 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.058753 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.058826 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.058882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.058903 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.058916 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.160990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.161036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.161048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.161065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.161078 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.263480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.263526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.263538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.263558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.263571 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.367114 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.367194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.367218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.367250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.367272 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.469521 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.469560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.469569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.469583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.469592 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.573811 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.573886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.573897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.573915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.573930 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.677778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.677868 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.677882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.677908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.677921 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.722709 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/0.log" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.728117 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53" exitCode=1 Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.728235 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.729476 4922 scope.go:117] "RemoveContainer" containerID="7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.736076 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" event={"ID":"8c029143-44a6-410b-8496-24f92c58bb8f","Type":"ContainerStarted","Data":"b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.753322 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.771873 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.784161 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.784208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.784220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.784240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.784253 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.788924 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.815956 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.832941 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.846220 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.858432 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.879146 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:45:06.214183 6135 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:45:06.214255 6135 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0929 09:45:06.214293 6135 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0929 09:45:06.214298 6135 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0929 09:45:06.214365 6135 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:45:06.214402 6135 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0929 09:45:06.214412 6135 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:45:06.214445 6135 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 09:45:06.214471 6135 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:45:06.214482 6135 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 09:45:06.214492 6135 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0929 09:45:06.214516 6135 factory.go:656] Stopping watch factory\\\\nI0929 09:45:06.214548 6135 handler.go:208] Removed *v1.Node event handler 7\\\\nI0929 09:45:06.214571 6135 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0929 09:45:06.214586 6135 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.888663 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.888712 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.888722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.888737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.888746 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.895688 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.910473 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.922456 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.937055 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.955996 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.969348 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.981369 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.990637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.990658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.990666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.990679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.990687 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:06Z","lastTransitionTime":"2025-09-29T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:06 crc kubenswrapper[4922]: I0929 09:45:06.993664 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:06Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.008042 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.023116 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.035806 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.050504 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.067035 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.080637 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.093296 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.093328 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.093339 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.093354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.093366 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:07Z","lastTransitionTime":"2025-09-29T09:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.097741 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.117738 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.131176 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.165610 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:45:06.214183 6135 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:45:06.214255 6135 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0929 09:45:06.214293 6135 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0929 09:45:06.214298 6135 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0929 09:45:06.214365 6135 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:45:06.214402 6135 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0929 09:45:06.214412 6135 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:45:06.214445 6135 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 09:45:06.214471 6135 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:45:06.214482 6135 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 09:45:06.214492 6135 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0929 09:45:06.214516 6135 factory.go:656] Stopping watch factory\\\\nI0929 09:45:06.214548 6135 handler.go:208] Removed *v1.Node event handler 7\\\\nI0929 09:45:06.214571 6135 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0929 09:45:06.214586 6135 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.195406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.195455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.195469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.195490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.195502 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:07Z","lastTransitionTime":"2025-09-29T09:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.197174 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30
c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.246302 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.266272 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.277463 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.298547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.298578 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.298587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 
29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.298602 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.298611 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:07Z","lastTransitionTime":"2025-09-29T09:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.401722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.401750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.401759 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.401771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.401779 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:07Z","lastTransitionTime":"2025-09-29T09:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.453516 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:07 crc kubenswrapper[4922]: E0929 09:45:07.453617 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.453954 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:07 crc kubenswrapper[4922]: E0929 09:45:07.453997 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.454049 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:07 crc kubenswrapper[4922]: E0929 09:45:07.454086 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.505360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.505900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.505913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.505937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.505952 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:07Z","lastTransitionTime":"2025-09-29T09:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.609744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.609802 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.609821 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.609872 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.609893 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:07Z","lastTransitionTime":"2025-09-29T09:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.713862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.713965 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.713991 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.714031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.714056 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:07Z","lastTransitionTime":"2025-09-29T09:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.745277 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/1.log" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.746403 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/0.log" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.751879 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e" exitCode=1 Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.753111 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.753224 4922 scope.go:117] "RemoveContainer" containerID="7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.754016 4922 scope.go:117] "RemoveContainer" containerID="8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e" Sep 29 09:45:07 crc kubenswrapper[4922]: E0929 09:45:07.754274 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.776055 4922 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.795090 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.811314 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.819211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.819273 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.819290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.819312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.819327 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:07Z","lastTransitionTime":"2025-09-29T09:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.833421 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.852717 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.870975 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.906925 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc3b4d8f6edc64634bd763e461ec3e35ff65631de366d7f26ed5ad436283c53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 09:45:06.214183 6135 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0929 09:45:06.214255 6135 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0929 09:45:06.214293 6135 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0929 09:45:06.214298 6135 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0929 09:45:06.214365 6135 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0929 09:45:06.214402 6135 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0929 09:45:06.214412 6135 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0929 09:45:06.214445 6135 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0929 09:45:06.214471 6135 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0929 09:45:06.214482 6135 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0929 09:45:06.214492 6135 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0929 09:45:06.214516 6135 factory.go:656] Stopping watch factory\\\\nI0929 09:45:06.214548 6135 handler.go:208] Removed *v1.Node event handler 7\\\\nI0929 09:45:06.214571 6135 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0929 09:45:06.214586 6135 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:07Z\\\",\\\"message\\\":\\\"LBGroup\\\\\\\"}}}\\\\nI0929 09:45:07.682064 6353 services_controller.go:452] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics per-node LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682075 6353 services_controller.go:453] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics template LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682086 6353 services_controller.go:454] Service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default has 1 
cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0929 09:45:07.682104 6353 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\
":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-over
rides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 
09:45:07.923038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.923078 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.923089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.923106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.923118 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:07Z","lastTransitionTime":"2025-09-29T09:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.944070 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.963705 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.977706 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:07 crc kubenswrapper[4922]: I0929 09:45:07.990504 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:07Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.016455 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.025765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.025821 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.025858 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.025878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.025892 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.034820 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad6
81ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.050175 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.072646 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.128719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.128760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.128772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.128789 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.128803 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.231308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.231755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.232010 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.232168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.232321 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.335465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.335513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.335526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.335547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.335561 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.439121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.439214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.439238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.439280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.439306 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.542148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.542200 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.542211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.542228 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.542240 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.644870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.644942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.644955 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.644978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.644993 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.747824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.747939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.747958 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.747987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.748007 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.758588 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/1.log" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.763788 4922 scope.go:117] "RemoveContainer" containerID="8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e" Sep 29 09:45:08 crc kubenswrapper[4922]: E0929 09:45:08.764132 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.786069 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.807587 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.827875 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.852031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.852094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.852108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 
29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.852133 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.852151 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.854168 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:07Z\\\",\\\"message\\\":\\\"LBGroup\\\\\\\"}}}\\\\nI0929 09:45:07.682064 6353 services_controller.go:452] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics per-node LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682075 6353 services_controller.go:453] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics template LB 
for network=default: []services.LB{}\\\\nI0929 09:45:07.682086 6353 services_controller.go:454] Service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0929 09:45:07.682104 6353 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.884221 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.912360 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.929679 4922 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.944814 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.955118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.955195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.955214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.955241 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.955259 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:08Z","lastTransitionTime":"2025-09-29T09:45:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.971062 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:08 crc kubenswrapper[4922]: I0929 09:45:08.990325 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:08Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.007646 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.026739 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.050301 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.058299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.058351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.058363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.058382 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.058396 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.071207 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.088689 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:09Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.161519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.161609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.161632 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.161663 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.161687 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.265089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.265184 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.265205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.265235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.265254 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.368496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.368561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.368573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.368594 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.368606 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.450863 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.450911 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.450915 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:09 crc kubenswrapper[4922]: E0929 09:45:09.451110 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:09 crc kubenswrapper[4922]: E0929 09:45:09.451212 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:09 crc kubenswrapper[4922]: E0929 09:45:09.451363 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.471758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.471855 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.471876 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.471906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.471925 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.575535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.575619 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.575638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.575667 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.575690 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.679404 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.679481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.679501 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.679528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.679547 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.783266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.783348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.783366 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.783395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.783417 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.886949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.887316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.887380 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.887452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.887576 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.990722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.990801 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.990824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.990896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:09 crc kubenswrapper[4922]: I0929 09:45:09.990919 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:09Z","lastTransitionTime":"2025-09-29T09:45:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.101195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.101255 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.101271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.101293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.101312 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:10Z","lastTransitionTime":"2025-09-29T09:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.204702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.204774 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.204797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.204827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.204885 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:10Z","lastTransitionTime":"2025-09-29T09:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.307784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.307877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.307895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.307936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.307952 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:10Z","lastTransitionTime":"2025-09-29T09:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.411364 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.411415 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.411426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.411444 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.411457 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:10Z","lastTransitionTime":"2025-09-29T09:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.451976 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h"] Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.452450 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.455733 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.456443 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.474584 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.486777 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.495030 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh7wj\" (UniqueName: \"kubernetes.io/projected/c3f01653-0511-4f73-ade6-c1d7f351e3e1-kube-api-access-nh7wj\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.495122 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3f01653-0511-4f73-ade6-c1d7f351e3e1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.495176 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3f01653-0511-4f73-ade6-c1d7f351e3e1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.495223 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3f01653-0511-4f73-ade6-c1d7f351e3e1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.503875 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.513544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.513590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.513606 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.513629 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.513644 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:10Z","lastTransitionTime":"2025-09-29T09:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.520368 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.531638 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.546266 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.573904 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefa
b4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.588774 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.596092 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3f01653-0511-4f73-ade6-c1d7f351e3e1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" 
(UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.596123 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3f01653-0511-4f73-ade6-c1d7f351e3e1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.596163 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh7wj\" (UniqueName: \"kubernetes.io/projected/c3f01653-0511-4f73-ade6-c1d7f351e3e1-kube-api-access-nh7wj\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.596196 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3f01653-0511-4f73-ade6-c1d7f351e3e1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.596811 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3f01653-0511-4f73-ade6-c1d7f351e3e1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.597103 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c3f01653-0511-4f73-ade6-c1d7f351e3e1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.602813 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.612108 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3f01653-0511-4f73-ade6-c1d7f351e3e1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.614606 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh7wj\" (UniqueName: \"kubernetes.io/projected/c3f01653-0511-4f73-ade6-c1d7f351e3e1-kube-api-access-nh7wj\") pod \"ovnkube-control-plane-749d76644c-b7k8h\" (UID: \"c3f01653-0511-4f73-ade6-c1d7f351e3e1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.616936 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.617090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.617130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.617148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:10 crc 
kubenswrapper[4922]: I0929 09:45:10.617173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.617190 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:10Z","lastTransitionTime":"2025-09-29T09:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.636330 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:07Z\\\",\\\"message\\\":\\\"LBGroup\\\\\\\"}}}\\\\nI0929 09:45:07.682064 6353 services_controller.go:452] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics per-node LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682075 6353 services_controller.go:453] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics template LB 
for network=default: []services.LB{}\\\\nI0929 09:45:07.682086 6353 services_controller.go:454] Service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0929 09:45:07.682104 6353 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.649954 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.663446 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.683364 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.696426 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.714345 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:10Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.720527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.720592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.720615 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.720646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.720669 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:10Z","lastTransitionTime":"2025-09-29T09:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.768793 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" Sep 29 09:45:10 crc kubenswrapper[4922]: W0929 09:45:10.793884 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3f01653_0511_4f73_ade6_c1d7f351e3e1.slice/crio-821c306049c272f75034003aeb689fa812360ce61bb029dffc9cd91a9e598792 WatchSource:0}: Error finding container 821c306049c272f75034003aeb689fa812360ce61bb029dffc9cd91a9e598792: Status 404 returned error can't find the container with id 821c306049c272f75034003aeb689fa812360ce61bb029dffc9cd91a9e598792 Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.823097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.823127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.823136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.823249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.823260 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:10Z","lastTransitionTime":"2025-09-29T09:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.926444 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.926490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.926501 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.926518 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:10 crc kubenswrapper[4922]: I0929 09:45:10.926529 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:10Z","lastTransitionTime":"2025-09-29T09:45:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.029673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.029742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.029762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.029791 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.029815 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.133869 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.133913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.133923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.133942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.133954 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.236622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.236665 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.236679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.236697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.236708 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.302936 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.303114 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.303148 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.303196 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303244 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-09-29 09:45:27.303201585 +0000 UTC m=+52.669432029 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303301 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303353 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303374 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303387 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303420 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-29 09:45:27.303395371 +0000 UTC m=+52.669625835 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.303335 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303491 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303519 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303532 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303539 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:45:11 
crc kubenswrapper[4922]: E0929 09:45:11.303541 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:27.303433562 +0000 UTC m=+52.669664036 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303638 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:27.303623957 +0000 UTC m=+52.669854221 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.303659 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:27.303650207 +0000 UTC m=+52.669880481 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.342536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.342610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.342628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.342657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.342671 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.445689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.445722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.445731 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.445743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.445752 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.451199 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.451209 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.451295 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.451316 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.451448 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.451534 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.549020 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.549125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.549238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.549538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.549585 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.571520 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9p9s8"] Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.572426 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.572538 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.580820 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.590892 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.606267 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jsvm\" (UniqueName: \"kubernetes.io/projected/48a99f27-a7b4-466d-b130-026774744f7d-kube-api-access-7jsvm\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.606347 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.608306 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc 
kubenswrapper[4922]: I0929 09:45:11.628107 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.645149 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.652186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.652251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.652270 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.652296 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.652316 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.662265 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.674389 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.690112 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.704098 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.707401 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jsvm\" (UniqueName: \"kubernetes.io/projected/48a99f27-a7b4-466d-b130-026774744f7d-kube-api-access-7jsvm\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.707481 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 09:45:11.707665 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:11 crc kubenswrapper[4922]: E0929 
09:45:11.707775 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs podName:48a99f27-a7b4-466d-b130-026774744f7d nodeName:}" failed. No retries permitted until 2025-09-29 09:45:12.207750789 +0000 UTC m=+37.573981083 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs") pod "network-metrics-daemon-9p9s8" (UID: "48a99f27-a7b4-466d-b130-026774744f7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.719710 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd
1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.724819 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jsvm\" (UniqueName: \"kubernetes.io/projected/48a99f27-a7b4-466d-b130-026774744f7d-kube-api-access-7jsvm\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.730665 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.750383 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:07Z\\\",\\\"message\\\":\\\"LBGroup\\\\\\\"}}}\\\\nI0929 09:45:07.682064 6353 services_controller.go:452] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics per-node LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682075 6353 services_controller.go:453] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics template LB 
for network=default: []services.LB{}\\\\nI0929 09:45:07.682086 6353 services_controller.go:454] Service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0929 09:45:07.682104 6353 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.755063 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.755182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.755208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.755273 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.755296 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.773458 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" event={"ID":"c3f01653-0511-4f73-ade6-c1d7f351e3e1","Type":"ContainerStarted","Data":"5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.773512 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" event={"ID":"c3f01653-0511-4f73-ade6-c1d7f351e3e1","Type":"ContainerStarted","Data":"ce0c2640c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.773525 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" event={"ID":"c3f01653-0511-4f73-ade6-c1d7f351e3e1","Type":"ContainerStarted","Data":"821c306049c272f75034003aeb689fa812360ce61bb029dffc9cd91a9e598792"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.783088 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.796354 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.808706 4922 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.826406 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.844554 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.857877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.857926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.857937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.857956 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.857971 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.860606 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad6
81ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.880498 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.900927 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.916407 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.932573 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.958907 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.960989 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.961047 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.961061 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.961083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.961097 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:11Z","lastTransitionTime":"2025-09-29T09:45:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.984458 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:11 crc kubenswrapper[4922]: I0929 09:45:11.999344 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:11Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.019458 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc 
kubenswrapper[4922]: I0929 09:45:12.049784 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724
e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 
09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.064627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.064667 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.064675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.064690 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.064700 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.073012 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.087916 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.104787 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c2640c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dc
cf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.140588 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.162366 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.167238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.167272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.167290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:12 crc 
kubenswrapper[4922]: I0929 09:45:12.167311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.167327 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.180357 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.194730 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.213230 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:12 crc kubenswrapper[4922]: E0929 09:45:12.213392 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:12 crc kubenswrapper[4922]: E0929 09:45:12.213464 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs podName:48a99f27-a7b4-466d-b130-026774744f7d nodeName:}" failed. No retries permitted until 2025-09-29 09:45:13.21343919 +0000 UTC m=+38.579669464 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs") pod "network-metrics-daemon-9p9s8" (UID: "48a99f27-a7b4-466d-b130-026774744f7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.222753 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:07Z\\\",\\\"message\\\":\\\"LBGroup\\\\\\\"}}}\\\\nI0929 09:45:07.682064 6353 services_controller.go:452] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics per-node LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682075 6353 services_controller.go:453] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics template LB 
for network=default: []services.LB{}\\\\nI0929 09:45:07.682086 6353 services_controller.go:454] Service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0929 09:45:07.682104 6353 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.270797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.270865 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.270878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.270895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.270907 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.373976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.374045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.374064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.374097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.374117 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.381232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.381281 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.381299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.381323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.381342 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: E0929 09:45:12.401330 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.408598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.408663 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.408682 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.408710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.408729 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: E0929 09:45:12.431641 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list elided, byte-identical to the 09:45:12.401330 entry above... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.440686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.440743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.440755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.440772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.440786 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: E0929 09:45:12.462603 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.467506 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.467584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.467609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.467625 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.467635 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: E0929 09:45:12.486618 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.490982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.491040 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.491053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.491072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.491084 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: E0929 09:45:12.510661 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:12Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:12 crc kubenswrapper[4922]: E0929 09:45:12.510963 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.513167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.513220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.513237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.513267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.513283 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.616220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.616274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.616287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.616311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.616324 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.719129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.719168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.719177 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.719197 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.719205 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.822054 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.822101 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.822112 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.822132 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.822145 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.924870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.924942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.924972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.924995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:12 crc kubenswrapper[4922]: I0929 09:45:12.925007 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:12Z","lastTransitionTime":"2025-09-29T09:45:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.028870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.028940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.028962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.028995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.029019 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.132090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.132143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.132160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.132184 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.132204 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.224280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:13 crc kubenswrapper[4922]: E0929 09:45:13.224472 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:13 crc kubenswrapper[4922]: E0929 09:45:13.224542 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs podName:48a99f27-a7b4-466d-b130-026774744f7d nodeName:}" failed. No retries permitted until 2025-09-29 09:45:15.224520186 +0000 UTC m=+40.590750460 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs") pod "network-metrics-daemon-9p9s8" (UID: "48a99f27-a7b4-466d-b130-026774744f7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.234975 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.235020 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.235034 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.235052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.235064 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.337380 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.337426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.337438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.337455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.337468 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.444397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.444472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.444504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.444576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.444602 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.450900 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.450993 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.450919 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:13 crc kubenswrapper[4922]: E0929 09:45:13.451120 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:13 crc kubenswrapper[4922]: E0929 09:45:13.451372 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.451442 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:13 crc kubenswrapper[4922]: E0929 09:45:13.451525 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:13 crc kubenswrapper[4922]: E0929 09:45:13.451687 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.547400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.547480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.547504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.547534 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.547558 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.651563 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.651634 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.651656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.651679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.651696 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.755168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.755219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.755230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.755246 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.755259 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.858592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.858658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.858675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.858699 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.858716 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.961882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.961947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.961982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.962007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:13 crc kubenswrapper[4922]: I0929 09:45:13.962025 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:13Z","lastTransitionTime":"2025-09-29T09:45:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.065056 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.065107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.065121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.065138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.065151 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.167647 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.167714 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.167738 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.167767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.167788 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.271506 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.271586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.271608 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.271637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.271658 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.375528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.375601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.375617 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.375642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.375663 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.479314 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.479382 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.479408 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.479440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.479465 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.582106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.582166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.582184 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.582210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.582227 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.685069 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.685110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.685121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.685139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.685149 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.788744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.788921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.789055 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.789092 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.789121 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.892401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.893269 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.893345 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.893452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.893648 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.997484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.997533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.997546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.997568 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:14 crc kubenswrapper[4922]: I0929 09:45:14.997581 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:14Z","lastTransitionTime":"2025-09-29T09:45:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.101056 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.101126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.101139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.101155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.101165 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:15Z","lastTransitionTime":"2025-09-29T09:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.204908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.204975 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.204999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.205031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.205069 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:15Z","lastTransitionTime":"2025-09-29T09:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.246524 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:15 crc kubenswrapper[4922]: E0929 09:45:15.246736 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:15 crc kubenswrapper[4922]: E0929 09:45:15.246825 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs podName:48a99f27-a7b4-466d-b130-026774744f7d nodeName:}" failed. No retries permitted until 2025-09-29 09:45:19.246799209 +0000 UTC m=+44.613029503 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs") pod "network-metrics-daemon-9p9s8" (UID: "48a99f27-a7b4-466d-b130-026774744f7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.308514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.308583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.308605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.308630 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.308647 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:15Z","lastTransitionTime":"2025-09-29T09:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.411771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.411894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.411921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.411955 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.411983 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:15Z","lastTransitionTime":"2025-09-29T09:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.451504 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.451568 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:15 crc kubenswrapper[4922]: E0929 09:45:15.451658 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:15 crc kubenswrapper[4922]: E0929 09:45:15.451777 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.451941 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.451998 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:15 crc kubenswrapper[4922]: E0929 09:45:15.452211 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:15 crc kubenswrapper[4922]: E0929 09:45:15.452417 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.473138 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.489287 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.506330 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc 
kubenswrapper[4922]: I0929 09:45:15.514708 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.514764 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.514781 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.514806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.514824 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:15Z","lastTransitionTime":"2025-09-29T09:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.530750 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.553580 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.566710 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.583129 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c264
0c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.606088 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.621561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.621638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.621656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.622304 4922 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.622415 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:15Z","lastTransitionTime":"2025-09-29T09:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.622271 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.645740 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.662705 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.682783 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:07Z\\\",\\\"message\\\":\\\"LBGroup\\\\\\\"}}}\\\\nI0929 09:45:07.682064 6353 services_controller.go:452] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics per-node LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682075 6353 services_controller.go:453] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics template LB 
for network=default: []services.LB{}\\\\nI0929 09:45:07.682086 6353 services_controller.go:454] Service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0929 09:45:07.682104 6353 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.700996 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.718312 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.726163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.726218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.726233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.726254 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.726269 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:15Z","lastTransitionTime":"2025-09-29T09:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.733929 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.747968 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.764567 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.829477 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.829543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.829562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.829587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.829605 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:15Z","lastTransitionTime":"2025-09-29T09:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.932957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.933076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.933097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.933125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:15 crc kubenswrapper[4922]: I0929 09:45:15.933149 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:15Z","lastTransitionTime":"2025-09-29T09:45:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.037066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.037700 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.037734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.037767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.037792 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.141371 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.141435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.141453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.141477 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.141495 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.244788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.244875 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.244893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.244917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.244934 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.349038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.349136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.349153 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.349177 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.349194 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.452128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.452199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.452221 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.452252 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.452272 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.555468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.555527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.555550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.555580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.555605 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.659004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.659091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.659112 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.659148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.659169 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.762645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.762719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.762741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.762774 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.762791 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.867592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.867675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.867696 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.867725 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.867747 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.970639 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.970698 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.970717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.970742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:16 crc kubenswrapper[4922]: I0929 09:45:16.970761 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:16Z","lastTransitionTime":"2025-09-29T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.074715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.074794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.074811 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.074862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.074882 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:17Z","lastTransitionTime":"2025-09-29T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.178013 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.178093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.178114 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.178142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.178161 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:17Z","lastTransitionTime":"2025-09-29T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.280801 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.280999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.281039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.281071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.281099 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:17Z","lastTransitionTime":"2025-09-29T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.391468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.392232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.392612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.393159 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.393562 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:17Z","lastTransitionTime":"2025-09-29T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.451245 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.451392 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.451270 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:17 crc kubenswrapper[4922]: E0929 09:45:17.451466 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:17 crc kubenswrapper[4922]: E0929 09:45:17.451565 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.451273 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:17 crc kubenswrapper[4922]: E0929 09:45:17.451801 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:17 crc kubenswrapper[4922]: E0929 09:45:17.451917 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.496368 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.496442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.496465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.496489 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.496508 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:17Z","lastTransitionTime":"2025-09-29T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.599446 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.599514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.599527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.599544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.599556 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:17Z","lastTransitionTime":"2025-09-29T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.702673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.702716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.702727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.702742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.702752 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:17Z","lastTransitionTime":"2025-09-29T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.804478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.804512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.804521 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.804536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.804547 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:17Z","lastTransitionTime":"2025-09-29T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.907337 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.907411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.907435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.907466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:17 crc kubenswrapper[4922]: I0929 09:45:17.907489 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:17Z","lastTransitionTime":"2025-09-29T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.011226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.011263 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.011275 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.011292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.011304 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.113854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.113914 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.113929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.113951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.113964 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.217401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.217485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.217509 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.217541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.217564 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.319962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.320000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.320011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.320030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.320042 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.423943 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.424002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.424028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.424056 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.424118 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.528192 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.528291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.528320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.528356 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.528376 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.632364 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.632439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.632456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.632573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.632640 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.735387 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.735443 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.735456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.735478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.735496 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.838472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.838548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.838567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.838595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.838613 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.942153 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.942229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.942244 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.942279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:18 crc kubenswrapper[4922]: I0929 09:45:18.942302 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:18Z","lastTransitionTime":"2025-09-29T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.046524 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.046587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.046609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.046636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.046653 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.150377 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.150431 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.150448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.150473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.150490 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.253621 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.254091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.254260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.254295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.254307 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.320878 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:19 crc kubenswrapper[4922]: E0929 09:45:19.321100 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:19 crc kubenswrapper[4922]: E0929 09:45:19.321184 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs podName:48a99f27-a7b4-466d-b130-026774744f7d nodeName:}" failed. No retries permitted until 2025-09-29 09:45:27.321159617 +0000 UTC m=+52.687389921 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs") pod "network-metrics-daemon-9p9s8" (UID: "48a99f27-a7b4-466d-b130-026774744f7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.357188 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.357258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.357332 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.357362 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.357380 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.451396 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.451468 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:19 crc kubenswrapper[4922]: E0929 09:45:19.451634 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.451764 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.451906 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:19 crc kubenswrapper[4922]: E0929 09:45:19.452143 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:19 crc kubenswrapper[4922]: E0929 09:45:19.452315 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:19 crc kubenswrapper[4922]: E0929 09:45:19.452371 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.461432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.461484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.461503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.461535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.461558 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.565908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.565982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.566002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.566029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.566048 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.669390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.669466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.669493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.669522 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.669540 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.772779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.772882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.772903 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.772933 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.772952 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.876094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.876421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.876507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.876595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.876696 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.980187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.980452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.980542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.980632 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:19 crc kubenswrapper[4922]: I0929 09:45:19.980718 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:19Z","lastTransitionTime":"2025-09-29T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.084760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.084803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.084815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.084854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.084867 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:20Z","lastTransitionTime":"2025-09-29T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.188129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.188192 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.188210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.188237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.188255 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:20Z","lastTransitionTime":"2025-09-29T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.291376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.291442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.291465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.291497 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.291519 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:20Z","lastTransitionTime":"2025-09-29T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.394194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.394248 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.394259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.394277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.394291 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:20Z","lastTransitionTime":"2025-09-29T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.497058 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.497104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.497120 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.497136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.497150 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:20Z","lastTransitionTime":"2025-09-29T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.600140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.600246 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.600275 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.600311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.600330 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:20Z","lastTransitionTime":"2025-09-29T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.702496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.702547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.702562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.702583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.702599 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:20Z","lastTransitionTime":"2025-09-29T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.805653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.805695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.805707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.805723 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.805735 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:20Z","lastTransitionTime":"2025-09-29T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.908714 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.908768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.908781 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.908803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:20 crc kubenswrapper[4922]: I0929 09:45:20.908818 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:20Z","lastTransitionTime":"2025-09-29T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.011467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.011521 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.011531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.011545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.011554 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.114953 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.115991 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.116233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.116283 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.116311 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.219258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.219301 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.219335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.219356 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.219367 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.322659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.322750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.322767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.322792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.322808 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.425685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.425787 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.425807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.425859 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.425880 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.451399 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.451463 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.451406 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.451397 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:21 crc kubenswrapper[4922]: E0929 09:45:21.451555 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:21 crc kubenswrapper[4922]: E0929 09:45:21.451670 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:21 crc kubenswrapper[4922]: E0929 09:45:21.451783 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:21 crc kubenswrapper[4922]: E0929 09:45:21.451888 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.529458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.529534 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.529554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.529583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.529606 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.633045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.633120 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.633141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.633170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.633190 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.735952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.736032 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.736053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.736082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.736103 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.838971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.839330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.839499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.839640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.839765 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.942762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.942852 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.942871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.942896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:21 crc kubenswrapper[4922]: I0929 09:45:21.942913 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:21Z","lastTransitionTime":"2025-09-29T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.046196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.046618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.046822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.047022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.047168 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.149794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.149873 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.149891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.149917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.149936 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.254628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.254696 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.254710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.254734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.254750 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.358062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.358455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.359185 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.359342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.359462 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.463007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.463069 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.463088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.463109 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.463126 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.559475 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.559915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.560072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.560229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.560378 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: E0929 09:45:22.584748 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.590482 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.590754 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.590976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.591159 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.591337 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: E0929 09:45:22.609301 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to the previous status patch, elided …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.614533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.614635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.614688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.614719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.614738 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: E0929 09:45:22.634606 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to the previous status patch, elided …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.640048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.640303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.640503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.640713 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.640908 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: E0929 09:45:22.661613 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.666918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.666984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.667003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.667029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.667047 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: E0929 09:45:22.684489 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:22Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:22 crc kubenswrapper[4922]: E0929 09:45:22.684716 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.686906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.686969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.686999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.687028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.687050 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.795057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.795191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.795214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.795248 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.795267 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.898184 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.898251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.898287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.898328 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:22 crc kubenswrapper[4922]: I0929 09:45:22.898354 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:22Z","lastTransitionTime":"2025-09-29T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.002109 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.002186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.002203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.002233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.002258 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.105331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.105407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.105436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.105480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.105506 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.208019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.208073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.208086 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.208108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.208124 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.311007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.311088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.311104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.311535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.311579 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.414915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.414966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.414982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.415004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.415018 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.451395 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.451460 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.451503 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.451579 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:23 crc kubenswrapper[4922]: E0929 09:45:23.451570 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:23 crc kubenswrapper[4922]: E0929 09:45:23.451741 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:23 crc kubenswrapper[4922]: E0929 09:45:23.451868 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:23 crc kubenswrapper[4922]: E0929 09:45:23.451995 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.518240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.518303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.518321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.518346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.518364 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.622352 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.622416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.622441 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.622473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.622499 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.725438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.725534 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.725551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.725620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.725639 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.828249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.828306 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.828323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.828345 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.828363 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.931432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.931502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.931519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.931544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:23 crc kubenswrapper[4922]: I0929 09:45:23.931566 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:23Z","lastTransitionTime":"2025-09-29T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.034976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.035045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.035062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.035088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.035106 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.138354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.138504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.138530 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.138562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.138584 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.241902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.241978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.241999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.242025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.242043 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.345089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.345154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.345171 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.345197 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.345216 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.448474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.448528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.448539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.448555 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.448564 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.452714 4922 scope.go:117] "RemoveContainer" containerID="8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.551944 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.552017 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.552036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.552060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.552080 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.656926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.656980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.656999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.657027 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.657045 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.760078 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.760149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.760166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.760195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.760213 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.830809 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/1.log" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.835172 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.835789 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.863962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.864061 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.864196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.864252 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.864289 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.915601 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.938029 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.967878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.967938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.967888 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.967958 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:24 crc 
kubenswrapper[4922]: I0929 09:45:24.967983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.968001 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:24Z","lastTransitionTime":"2025-09-29T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:24 crc kubenswrapper[4922]: I0929 09:45:24.998389 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.029186 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:07Z\\\",\\\"message\\\":\\\"LBGroup\\\\\\\"}}}\\\\nI0929 09:45:07.682064 6353 services_controller.go:452] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics per-node LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682075 6353 services_controller.go:453] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics template LB 
for network=default: []services.LB{}\\\\nI0929 09:45:07.682086 6353 services_controller.go:454] Service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0929 09:45:07.682104 6353 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.054323 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.070472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.070523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.070534 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.070551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.070562 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:25Z","lastTransitionTime":"2025-09-29T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.074764 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a
8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.093377 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.105523 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.121576 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.136984 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.151154 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.163090 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc 
kubenswrapper[4922]: I0929 09:45:25.176973 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.177023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.177035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.177053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.177064 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:25Z","lastTransitionTime":"2025-09-29T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.181851 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.198120 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.213295 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.227226 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c264
0c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.278923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.278965 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.278977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.278993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.279005 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:25Z","lastTransitionTime":"2025-09-29T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.381760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.381801 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.381812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.381849 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.381861 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:25Z","lastTransitionTime":"2025-09-29T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.450765 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.450873 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.450781 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:25 crc kubenswrapper[4922]: E0929 09:45:25.450984 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.451013 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:25 crc kubenswrapper[4922]: E0929 09:45:25.451090 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:25 crc kubenswrapper[4922]: E0929 09:45:25.451271 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:25 crc kubenswrapper[4922]: E0929 09:45:25.451411 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.467806 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.484317 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.484363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.484383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.484408 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.484426 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:25Z","lastTransitionTime":"2025-09-29T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.486050 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc 
kubenswrapper[4922]: I0929 09:45:25.502674 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.519442 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca53
5786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.544538 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.561159 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.585694 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c264
0c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.587949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.588012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.588039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.588070 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.588095 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:25Z","lastTransitionTime":"2025-09-29T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.607197 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.631282 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.649280 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.679936 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:07Z\\\",\\\"message\\\":\\\"LBGroup\\\\\\\"}}}\\\\nI0929 09:45:07.682064 6353 services_controller.go:452] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics per-node LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682075 6353 services_controller.go:453] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics template LB 
for network=default: []services.LB{}\\\\nI0929 09:45:07.682086 6353 services_controller.go:454] Service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0929 09:45:07.682104 6353 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.690717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.690778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.690796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.690823 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.690870 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:25Z","lastTransitionTime":"2025-09-29T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.707712 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.733125 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.754260 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.771420 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.794156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.794414 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.794499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.794587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.794704 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:25Z","lastTransitionTime":"2025-09-29T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.798550 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.815941 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.842118 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/2.log" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.842976 4922 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/1.log" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.846477 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c" exitCode=1 Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.846538 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.846662 4922 scope.go:117] "RemoveContainer" containerID="8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.847529 4922 scope.go:117] "RemoveContainer" containerID="9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c" Sep 29 09:45:25 crc kubenswrapper[4922]: E0929 09:45:25.847789 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.876881 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.895319 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.897585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.897614 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.897623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:25 crc 
kubenswrapper[4922]: I0929 09:45:25.897638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.897649 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:25Z","lastTransitionTime":"2025-09-29T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.908351 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.920773 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.939891 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e3917b1109bc9632a6cdfed6a94bf53acef589fd2207e8368ed6205a278976e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:07Z\\\",\\\"message\\\":\\\"LBGroup\\\\\\\"}}}\\\\nI0929 09:45:07.682064 6353 services_controller.go:452] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics per-node LB for network=default: []services.LB{}\\\\nI0929 09:45:07.682075 6353 services_controller.go:453] Built service openshift-operator-lifecycle-manager/catalog-operator-metrics template LB 
for network=default: []services.LB{}\\\\nI0929 09:45:07.682086 6353 services_controller.go:454] Service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0929 09:45:07.682104 6353 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.951483 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.972351 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.984360 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:25 crc kubenswrapper[4922]: I0929 09:45:25.999888 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.001230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.001295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.001343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.001378 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.001406 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.015865 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.037902 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.052668 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.066269 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc 
kubenswrapper[4922]: I0929 09:45:26.090389 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724
e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 
09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.104941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.105025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.105046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.105074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.105094 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.113510 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.131998 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.149643 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c2640c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dc
cf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.208341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.208427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.208450 4922 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.208476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.208495 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.311321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.312258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.312632 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.312807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.313016 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.416064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.416427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.416533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.416620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.416711 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.521044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.521116 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.522554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.522585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.522615 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.626159 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.626196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.626209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.626225 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.626235 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.729459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.729532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.729554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.729583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.729605 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.832998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.833075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.833098 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.833128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.833150 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.854317 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/2.log" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.859670 4922 scope.go:117] "RemoveContainer" containerID="9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c" Sep 29 09:45:26 crc kubenswrapper[4922]: E0929 09:45:26.860156 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.876526 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.892889 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.909680 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.926666 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.936408 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.936517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.936576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.936597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.936609 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:26Z","lastTransitionTime":"2025-09-29T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.950688 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.970153 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:26 crc kubenswrapper[4922]: I0929 09:45:26.986491 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:26Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.002813 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc 
kubenswrapper[4922]: I0929 09:45:27.023934 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724
e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 
09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.039922 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.039979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.039990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.040040 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.040056 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.041240 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.056611 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.074511 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c2640c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dc
cf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.111896 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.133478 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.142627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.142687 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.142706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc 
kubenswrapper[4922]: I0929 09:45:27.142732 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.142752 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.156008 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.177230 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.210404 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:27Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.246082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.246130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.246144 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.246166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.246179 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.318230 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.318383 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.318441 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318502 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:45:59.318461368 +0000 UTC m=+84.684691672 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.318575 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318619 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.318642 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318667 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318715 4922 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318728 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:59.318696455 +0000 UTC m=+84.684926759 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318740 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318797 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318816 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:59.318791327 +0000 UTC m=+84.685021651 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318909 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318926 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:59.31889875 +0000 UTC m=+84.685129064 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318933 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.318964 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.319033 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:45:59.319014213 +0000 UTC m=+84.685244517 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.352284 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.352350 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.352368 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.352393 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.352410 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.419562 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.419737 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.419804 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs podName:48a99f27-a7b4-466d-b130-026774744f7d nodeName:}" failed. No retries permitted until 2025-09-29 09:45:43.419783431 +0000 UTC m=+68.786013705 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs") pod "network-metrics-daemon-9p9s8" (UID: "48a99f27-a7b4-466d-b130-026774744f7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.451629 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.451680 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.451879 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.451956 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.452084 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.452350 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.453043 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:27 crc kubenswrapper[4922]: E0929 09:45:27.453110 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.457618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.457663 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.457680 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.457704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.457725 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.561381 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.561447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.561465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.561490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.561509 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.665025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.665099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.665118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.665143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.665161 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.770367 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.770451 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.770476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.770508 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.770536 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.874143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.874210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.874227 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.874252 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.874270 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.978139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.978186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.978206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.978222 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:27 crc kubenswrapper[4922]: I0929 09:45:27.978233 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:27Z","lastTransitionTime":"2025-09-29T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.080685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.080719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.080727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.080741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.080750 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:28Z","lastTransitionTime":"2025-09-29T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.184109 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.184150 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.184162 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.184182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.184193 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:28Z","lastTransitionTime":"2025-09-29T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.287155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.287213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.287230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.287254 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.287272 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:28Z","lastTransitionTime":"2025-09-29T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.390071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.390143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.390167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.390203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.390229 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:28Z","lastTransitionTime":"2025-09-29T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.497889 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.499978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.500004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.500033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.500051 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:28Z","lastTransitionTime":"2025-09-29T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.603576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.604122 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.604133 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.604154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.604167 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:28Z","lastTransitionTime":"2025-09-29T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.707683 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.707746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.707763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.707790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.707808 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:28Z","lastTransitionTime":"2025-09-29T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.812103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.812151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.812164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.812183 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.812193 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:28Z","lastTransitionTime":"2025-09-29T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.915888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.916299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.916385 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.916539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:28 crc kubenswrapper[4922]: I0929 09:45:28.916619 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:28Z","lastTransitionTime":"2025-09-29T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.020899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.020969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.020986 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.021011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.021028 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.123784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.123863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.123882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.123910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.123927 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.227500 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.227547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.227557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.227573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.227585 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.329995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.330057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.330069 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.330090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.330102 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.433260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.433348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.433383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.433416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.433440 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.450942 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.451047 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.451054 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:29 crc kubenswrapper[4922]: E0929 09:45:29.451641 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.451185 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:29 crc kubenswrapper[4922]: E0929 09:45:29.451979 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:29 crc kubenswrapper[4922]: E0929 09:45:29.452124 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:29 crc kubenswrapper[4922]: E0929 09:45:29.452332 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.536893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.536945 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.536959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.536977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.536988 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.639778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.639907 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.639928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.639952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.639971 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.744081 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.744985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.745316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.745552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.745686 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.848164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.848536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.848691 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.848891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.849066 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.951499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.951808 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.952025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.952213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:29 crc kubenswrapper[4922]: I0929 09:45:29.952342 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:29Z","lastTransitionTime":"2025-09-29T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.055560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.055979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.056186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.056375 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.056544 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.159531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.159587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.159605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.159629 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.159647 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.267742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.267871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.267898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.267924 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.267942 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.371041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.371100 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.371112 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.371136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.371151 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.475469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.475520 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.475536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.475560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.475577 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.577922 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.577963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.577971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.577984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.578015 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.682669 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.682731 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.682744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.682761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.682774 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.786046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.786091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.786109 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.786131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.786148 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.888475 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.888611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.888674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.888701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.888753 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.992104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.992163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.992182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.992210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:30 crc kubenswrapper[4922]: I0929 09:45:30.992229 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:30Z","lastTransitionTime":"2025-09-29T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.096438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.096500 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.096525 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.096553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.096574 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:31Z","lastTransitionTime":"2025-09-29T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.199708 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.199794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.199818 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.199879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.199899 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:31Z","lastTransitionTime":"2025-09-29T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.303065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.303107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.303118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.303139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.303150 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:31Z","lastTransitionTime":"2025-09-29T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.405594 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.405646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.405657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.405672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.405680 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:31Z","lastTransitionTime":"2025-09-29T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.451493 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.451589 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:31 crc kubenswrapper[4922]: E0929 09:45:31.451733 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.451757 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.451743 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:31 crc kubenswrapper[4922]: E0929 09:45:31.451796 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:31 crc kubenswrapper[4922]: E0929 09:45:31.452239 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:31 crc kubenswrapper[4922]: E0929 09:45:31.452058 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.509042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.509130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.509157 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.509193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.509220 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:31Z","lastTransitionTime":"2025-09-29T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.612019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.612075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.612088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.612107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.612119 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:31Z","lastTransitionTime":"2025-09-29T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.714530 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.714614 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.714639 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.714670 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.714755 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:31Z","lastTransitionTime":"2025-09-29T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.817954 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.817993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.818008 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.818028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.818042 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:31Z","lastTransitionTime":"2025-09-29T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.823625 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.840293 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.845987 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:31Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.866764 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:31Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.887692 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:31Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.908643 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:31Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.921439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.921487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.921503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.921528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.921546 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:31Z","lastTransitionTime":"2025-09-29T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.957531 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:31Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:31 crc kubenswrapper[4922]: I0929 09:45:31.990407 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:31Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.007680 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.022125 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc 
kubenswrapper[4922]: I0929 09:45:32.024202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.024265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.024283 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.024308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.024325 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.044309 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.059470 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.072628 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.087051 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c264
0c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.115960 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.127541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.127600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.127611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.127628 4922 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.127639 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.135589 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.158207 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.180889 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e
43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.212097 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.230007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.230087 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.230110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.230142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.230165 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.333521 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.333573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.333586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.333605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.333617 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.436865 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.436936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.436953 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.436979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.436997 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.540554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.540619 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.540642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.540673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.540696 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.644098 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.644140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.644152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.644172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.644185 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.747350 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.747388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.747395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.747409 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.747418 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.850232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.850267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.850275 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.850288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.850296 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.953988 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.954068 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.954091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.954122 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.954145 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.961939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.962006 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.962024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.962049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.962070 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: E0929 09:45:32.976661 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.980375 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.980428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.980439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.980458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.980469 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:32 crc kubenswrapper[4922]: E0929 09:45:32.995880 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:32Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.999820 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.999895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.999906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:32 crc kubenswrapper[4922]: I0929 09:45:32.999921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:32.999935 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:32Z","lastTransitionTime":"2025-09-29T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: E0929 09:45:33.014337 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:33Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.018777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.018825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.018889 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.018920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.018947 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: E0929 09:45:33.037845 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:33Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.043390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.043456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.043476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.043502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.043521 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: E0929 09:45:33.063812 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:33Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:33 crc kubenswrapper[4922]: E0929 09:45:33.064327 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.066196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.066239 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.066254 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.066273 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.066287 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.170890 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.170935 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.170951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.170974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.170991 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.276446 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.276598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.276622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.276647 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.276664 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.379754 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.379862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.379882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.379907 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.379925 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.451655 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.451698 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.451748 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.451939 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:33 crc kubenswrapper[4922]: E0929 09:45:33.451938 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:33 crc kubenswrapper[4922]: E0929 09:45:33.452026 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:33 crc kubenswrapper[4922]: E0929 09:45:33.452142 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:33 crc kubenswrapper[4922]: E0929 09:45:33.452244 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.484259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.484292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.484302 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.484315 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.484324 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.587153 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.587219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.587237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.587263 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.587279 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.690168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.690232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.690248 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.690279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.690297 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.793125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.793201 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.793220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.793245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.793263 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.896211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.896293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.896317 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.896347 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.896371 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.999135 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.999195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.999207 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.999223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:33 crc kubenswrapper[4922]: I0929 09:45:33.999234 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:33Z","lastTransitionTime":"2025-09-29T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.103181 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.103246 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.103282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.103315 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.103335 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:34Z","lastTransitionTime":"2025-09-29T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.206647 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.206710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.206735 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.206764 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.206784 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:34Z","lastTransitionTime":"2025-09-29T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.309935 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.310006 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.310030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.310059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.310075 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:34Z","lastTransitionTime":"2025-09-29T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.413296 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.413344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.413360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.413385 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.413404 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:34Z","lastTransitionTime":"2025-09-29T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.515773 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.515832 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.515862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.515882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.515894 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:34Z","lastTransitionTime":"2025-09-29T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.618593 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.618630 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.618642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.618658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.618671 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:34Z","lastTransitionTime":"2025-09-29T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.722011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.722062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.722075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.722089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.722098 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:34Z","lastTransitionTime":"2025-09-29T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.824182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.824239 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.824255 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.824277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.824294 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:34Z","lastTransitionTime":"2025-09-29T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.926605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.926662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.926677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.926702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:34 crc kubenswrapper[4922]: I0929 09:45:34.926721 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:34Z","lastTransitionTime":"2025-09-29T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.029235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.029290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.029308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.029329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.029346 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.132695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.132753 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.132771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.132795 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.132814 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.236055 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.236125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.236142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.236169 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.236188 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.338589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.338643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.338651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.338665 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.338673 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.441454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.441529 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.441551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.441583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.441604 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.450762 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.450882 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.451044 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:35 crc kubenswrapper[4922]: E0929 09:45:35.451026 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:35 crc kubenswrapper[4922]: E0929 09:45:35.451205 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.451401 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:35 crc kubenswrapper[4922]: E0929 09:45:35.451531 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:35 crc kubenswrapper[4922]: E0929 09:45:35.451689 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.471607 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08a
af09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.501647 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.536812 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.543458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.543502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.543514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.543532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.543544 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.556691 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.575814 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.595170 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.614568 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.631905 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.646631 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.646662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.646674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.646690 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.646701 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.650343 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad6
81ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.673209 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.690662 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.707520 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.722372 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc 
kubenswrapper[4922]: I0929 09:45:35.737975 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.749811 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.750128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.750266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.750416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.750554 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.756950 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c2640c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.776510 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.792891 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca53
5786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.809941 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:35Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.854105 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.854166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.854179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.854199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.854212 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.956995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.957046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.957055 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.957073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:35 crc kubenswrapper[4922]: I0929 09:45:35.957084 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:35Z","lastTransitionTime":"2025-09-29T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.060448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.060494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.060510 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.060533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.060549 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.163088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.163138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.163151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.163172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.163184 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.268589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.269034 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.269240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.269427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.269635 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.373358 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.373396 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.373405 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.373418 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.373427 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.476291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.476406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.476418 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.476435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.476446 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.579146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.579194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.579208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.579226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.579238 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.682499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.682560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.682576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.682600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.682619 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.786090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.786171 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.786192 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.786221 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.786245 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.889234 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.889310 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.889327 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.889350 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.889367 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.992192 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.992297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.992322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.992353 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:36 crc kubenswrapper[4922]: I0929 09:45:36.992374 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:36Z","lastTransitionTime":"2025-09-29T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.096316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.096396 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.096419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.096449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.096467 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:37Z","lastTransitionTime":"2025-09-29T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.200478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.200551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.200569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.200599 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.200617 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:37Z","lastTransitionTime":"2025-09-29T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.304129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.304195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.304212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.304237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.304256 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:37Z","lastTransitionTime":"2025-09-29T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.421658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.421716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.421734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.421759 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.421775 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:37Z","lastTransitionTime":"2025-09-29T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.451773 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.451901 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.451906 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:37 crc kubenswrapper[4922]: E0929 09:45:37.451971 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:37 crc kubenswrapper[4922]: E0929 09:45:37.452169 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.452210 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:37 crc kubenswrapper[4922]: E0929 09:45:37.452281 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:37 crc kubenswrapper[4922]: E0929 09:45:37.452373 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.524772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.524865 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.524885 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.524910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.524932 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:37Z","lastTransitionTime":"2025-09-29T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.628277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.628326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.628343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.628366 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.628384 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:37Z","lastTransitionTime":"2025-09-29T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.731971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.733145 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.733214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.733258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.733284 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:37Z","lastTransitionTime":"2025-09-29T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.836648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.836715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.836735 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.836762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.836789 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:37Z","lastTransitionTime":"2025-09-29T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.939401 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.939469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.939486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.939510 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:37 crc kubenswrapper[4922]: I0929 09:45:37.939526 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:37Z","lastTransitionTime":"2025-09-29T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.042750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.042881 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.042908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.042941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.042966 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.146072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.146160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.146181 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.146206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.146225 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.249064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.249129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.249147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.249172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.249189 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.351751 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.351832 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.351891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.351921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.351944 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.453974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.454020 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.454031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.454048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.454060 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.556026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.556097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.556123 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.556156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.556180 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.658650 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.658723 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.658741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.658767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.658787 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.761879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.761916 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.761929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.761946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.761960 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.864695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.864730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.864739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.864751 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.864761 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.967666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.967940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.967961 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.967983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:38 crc kubenswrapper[4922]: I0929 09:45:38.967995 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:38Z","lastTransitionTime":"2025-09-29T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.070782 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.070831 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.070858 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.070877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.070889 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.173360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.173428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.173445 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.173468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.173483 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.275324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.275357 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.275365 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.275380 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.275390 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.378100 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.378140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.378150 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.378164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.378172 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.451588 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.451647 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:39 crc kubenswrapper[4922]: E0929 09:45:39.451710 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.451789 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:39 crc kubenswrapper[4922]: E0929 09:45:39.451896 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.451939 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:39 crc kubenswrapper[4922]: E0929 09:45:39.451961 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:39 crc kubenswrapper[4922]: E0929 09:45:39.452117 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.480544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.480598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.480613 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.480633 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.480646 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.583644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.583703 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.583720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.583743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.583759 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.687617 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.687675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.687692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.687718 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.687735 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.790601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.790644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.790657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.790674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.790685 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.893306 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.893392 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.893415 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.893447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.893469 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.995672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.995725 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.995739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.995755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:39 crc kubenswrapper[4922]: I0929 09:45:39.995764 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:39Z","lastTransitionTime":"2025-09-29T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.099137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.099177 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.099186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.099202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.099214 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:40Z","lastTransitionTime":"2025-09-29T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.202037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.202099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.202131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.202172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.202196 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:40Z","lastTransitionTime":"2025-09-29T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.305322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.305357 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.305365 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.305379 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.305387 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:40Z","lastTransitionTime":"2025-09-29T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.408719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.408779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.408797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.408825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.408872 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:40Z","lastTransitionTime":"2025-09-29T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.517244 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.517309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.517338 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.517369 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.517391 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:40Z","lastTransitionTime":"2025-09-29T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.624317 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.624358 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.624367 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.624382 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.624392 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:40Z","lastTransitionTime":"2025-09-29T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.727420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.727469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.727484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.727502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.727515 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:40Z","lastTransitionTime":"2025-09-29T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.830166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.830255 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.830274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.830299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.830317 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:40Z","lastTransitionTime":"2025-09-29T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.933480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.933541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.933562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.933589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:40 crc kubenswrapper[4922]: I0929 09:45:40.933606 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:40Z","lastTransitionTime":"2025-09-29T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.037121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.037196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.037219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.037249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.037269 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.141582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.141636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.141654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.141680 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.141698 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.245328 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.245374 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.245386 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.245432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.245446 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.348369 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.348452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.348473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.348499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.348520 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.451279 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.451350 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.451281 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:41 crc kubenswrapper[4922]: E0929 09:45:41.451409 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.451660 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.451719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.451751 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.451762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.451778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.451792 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: E0929 09:45:41.451789 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:41 crc kubenswrapper[4922]: E0929 09:45:41.451984 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.452021 4922 scope.go:117] "RemoveContainer" containerID="9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c" Sep 29 09:45:41 crc kubenswrapper[4922]: E0929 09:45:41.452037 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:41 crc kubenswrapper[4922]: E0929 09:45:41.452598 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.554863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.554908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.554921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.554938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.554948 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.657567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.657601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.657609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.657624 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.657633 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.760729 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.760777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.760788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.761009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.761019 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.863675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.863736 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.863752 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.863777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.863794 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.967098 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.967131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.967139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.967152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:41 crc kubenswrapper[4922]: I0929 09:45:41.967162 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:41Z","lastTransitionTime":"2025-09-29T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.070351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.070386 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.070394 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.070409 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.070418 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.172448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.172483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.172491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.172505 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.172516 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.274376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.274419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.274428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.274442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.274452 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.377341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.377451 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.377471 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.377498 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.377520 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.479550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.479616 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.479647 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.479675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.479694 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.581709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.581747 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.581760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.581775 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.581784 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.684316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.684375 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.684397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.684421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.684439 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.786623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.786658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.786666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.786681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.786689 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.888954 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.888990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.888999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.889013 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.889021 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.991730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.991790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.991806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.991855 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:42 crc kubenswrapper[4922]: I0929 09:45:42.991875 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:42Z","lastTransitionTime":"2025-09-29T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.095008 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.095062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.095073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.095126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.095147 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.198212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.198278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.198292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.198315 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.198330 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.301155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.301211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.301229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.301253 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.301270 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.319470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.319509 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.319522 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.319536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.319548 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.331651 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:43Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.336195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.336230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.336241 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.336257 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.336268 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.348286 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:43Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.351349 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.351383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.351391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.351406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.351416 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.420481 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:43Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.423394 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.423423 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.423431 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.423444 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.423455 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.436515 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:43Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.440109 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.440134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.440142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.440156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.440165 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.451987 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.452129 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.452129 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.452197 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.452268 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.452330 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.452380 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.452429 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.452867 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:43Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.452982 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.457043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.457069 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.457077 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.457089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.457100 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.509101 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.509306 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:43 crc kubenswrapper[4922]: E0929 09:45:43.509384 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs podName:48a99f27-a7b4-466d-b130-026774744f7d nodeName:}" failed. No retries permitted until 2025-09-29 09:46:15.509364061 +0000 UTC m=+100.875594335 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs") pod "network-metrics-daemon-9p9s8" (UID: "48a99f27-a7b4-466d-b130-026774744f7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.560080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.560119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.560131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.560147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.560158 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.662904 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.662950 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.662961 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.662979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.662991 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.765592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.765681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.765701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.765728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.765751 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.867990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.868041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.868054 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.868074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.868091 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.972400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.972447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.972464 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.972487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:43 crc kubenswrapper[4922]: I0929 09:45:43.972505 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:43Z","lastTransitionTime":"2025-09-29T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.075293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.075349 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.075364 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.075382 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.075395 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:44Z","lastTransitionTime":"2025-09-29T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.178000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.178040 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.178049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.178063 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.178072 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:44Z","lastTransitionTime":"2025-09-29T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.280326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.280379 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.280392 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.280410 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.280423 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:44Z","lastTransitionTime":"2025-09-29T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.383240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.383390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.383412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.383436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.383454 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:44Z","lastTransitionTime":"2025-09-29T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.486403 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.486454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.486465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.486481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.486563 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:44Z","lastTransitionTime":"2025-09-29T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.588706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.588745 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.588756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.588772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.588783 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:44Z","lastTransitionTime":"2025-09-29T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.694893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.694949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.694966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.694989 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.695005 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:44Z","lastTransitionTime":"2025-09-29T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.798270 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.798335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.798359 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.798393 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.798411 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:44Z","lastTransitionTime":"2025-09-29T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.901907 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.901968 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.901989 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.902014 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.902034 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:44Z","lastTransitionTime":"2025-09-29T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.923042 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/0.log" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.923108 4922 generic.go:334] "Generic (PLEG): container finished" podID="7dc69012-4e4c-437b-82d8-9d04e2e22e58" containerID="571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc" exitCode=1 Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.923154 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6dfk" event={"ID":"7dc69012-4e4c-437b-82d8-9d04e2e22e58","Type":"ContainerDied","Data":"571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc"} Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.923733 4922 scope.go:117] "RemoveContainer" containerID="571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.943565 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:44Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.961686 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:44Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:44 crc kubenswrapper[4922]: I0929 09:45:44.975203 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:44Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:44 crc 
kubenswrapper[4922]: I0929 09:45:44.990589 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:44Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.005023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.005197 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.005227 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.005310 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.005385 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.015525 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.035558 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.052746 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.068823 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c264
0c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.092827 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.104892 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.107893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.107955 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.107970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.107992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.108362 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.122583 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"2025-09-29T09:44:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8\\\\n2025-09-29T09:44:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8 to /host/opt/cni/bin/\\\\n2025-09-29T09:44:59Z [verbose] multus-daemon started\\\\n2025-09-29T09:44:59Z [verbose] Readiness Indicator file check\\\\n2025-09-29T09:45:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.141444 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.161389 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.179954 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.197234 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.212044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.212088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.212105 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.212127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.212144 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.214925 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.232765 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.250138 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.314925 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.314966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.314977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.314994 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.315006 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.417643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.417694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.417724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.417743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.417753 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.451536 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.451619 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:45 crc kubenswrapper[4922]: E0929 09:45:45.451647 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.451977 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:45 crc kubenswrapper[4922]: E0929 09:45:45.451984 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.452193 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:45 crc kubenswrapper[4922]: E0929 09:45:45.452240 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:45 crc kubenswrapper[4922]: E0929 09:45:45.452454 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.479012 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.501099 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.520953 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.521023 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.521044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.521076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.521097 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.522189 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.545565 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.557951 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.573865 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.582575 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.594731 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc 
kubenswrapper[4922]: I0929 09:45:45.611084 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.623588 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.623618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.623627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.623641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.623674 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.630489 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.646044 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.659082 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.675473 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c264
0c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.701861 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.720133 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.726089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.726168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.726191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.726224 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.726250 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.741455 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"2025-09-29T09:44:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8\\\\n2025-09-29T09:44:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8 to /host/opt/cni/bin/\\\\n2025-09-29T09:44:59Z [verbose] multus-daemon started\\\\n2025-09-29T09:44:59Z [verbose] Readiness Indicator file check\\\\n2025-09-29T09:45:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.759396 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.788479 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.831048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.831141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.831164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.831196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.831220 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.928262 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/0.log" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.928377 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6dfk" event={"ID":"7dc69012-4e4c-437b-82d8-9d04e2e22e58","Type":"ContainerStarted","Data":"0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.933886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.933957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.933980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.934008 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.934026 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:45Z","lastTransitionTime":"2025-09-29T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.944777 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.960041 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca53
5786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.980159 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:45 crc kubenswrapper[4922]: I0929 09:45:45.991689 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:45Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.002190 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c264
0c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.020586 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.034491 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.036370 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.036412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.036423 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.036444 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.036461 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.050767 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"2025-09-29T09:44:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8\\\\n2025-09-29T09:44:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8 to /host/opt/cni/bin/\\\\n2025-09-29T09:44:59Z [verbose] multus-daemon started\\\\n2025-09-29T09:44:59Z [verbose] Readiness Indicator file check\\\\n2025-09-29T09:45:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.062037 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.078415 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.092355 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.107579 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.121152 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.134241 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.141531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.141588 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.141604 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.141626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.141642 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.149073 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.165055 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.175013 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.184950 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:46Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:46 crc 
kubenswrapper[4922]: I0929 09:45:46.244376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.244443 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.244452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.244468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.244478 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.347317 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.347376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.347387 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.347406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.347417 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.449732 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.449783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.449793 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.449808 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.449817 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.552046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.552088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.552097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.552111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.552122 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.654576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.654689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.654700 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.654716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.654724 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.757669 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.757706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.757716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.757730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.757740 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.861132 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.861173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.861182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.861198 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.861209 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.963388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.963452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.963473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.963502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:46 crc kubenswrapper[4922]: I0929 09:45:46.963522 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:46Z","lastTransitionTime":"2025-09-29T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.065981 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.066045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.066068 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.066097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.066114 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:47Z","lastTransitionTime":"2025-09-29T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.168783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.168877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.168901 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.168939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.168976 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:47Z","lastTransitionTime":"2025-09-29T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.271944 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.271987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.271998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.272033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.272045 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:47Z","lastTransitionTime":"2025-09-29T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.403469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.403504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.403511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.403525 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.403533 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:47Z","lastTransitionTime":"2025-09-29T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.451269 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.451346 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.451402 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.451430 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:47 crc kubenswrapper[4922]: E0929 09:45:47.451523 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:47 crc kubenswrapper[4922]: E0929 09:45:47.451620 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:47 crc kubenswrapper[4922]: E0929 09:45:47.451712 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:47 crc kubenswrapper[4922]: E0929 09:45:47.451775 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.507417 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.507499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.507513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.507537 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.507552 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:47Z","lastTransitionTime":"2025-09-29T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.611311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.611388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.611416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.611454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.611476 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:47Z","lastTransitionTime":"2025-09-29T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.715669 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.715718 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.715727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.715742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.715752 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:47Z","lastTransitionTime":"2025-09-29T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.819798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.819938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.819967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.820014 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.820041 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:47Z","lastTransitionTime":"2025-09-29T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.923972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.924090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.924529 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.924603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:47 crc kubenswrapper[4922]: I0929 09:45:47.924990 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:47Z","lastTransitionTime":"2025-09-29T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.028286 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.028362 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.028382 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.028412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.028430 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.132417 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.132481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.132502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.132531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.132549 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.235861 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.235917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.235943 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.235967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.235981 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.338888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.338939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.338952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.338970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.338983 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.441522 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.441576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.441587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.441605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.441648 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.544926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.544995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.545013 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.545039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.545061 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.650689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.650765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.650783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.650808 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.650855 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.753212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.753260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.753270 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.753287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.753298 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.856065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.856129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.856152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.856179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.856201 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.958812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.958903 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.958920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.958950 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:48 crc kubenswrapper[4922]: I0929 09:45:48.958985 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:48Z","lastTransitionTime":"2025-09-29T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.060882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.060947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.060963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.060988 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.061002 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.164468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.164536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.164558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.164585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.164604 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.267586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.267674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.267696 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.267729 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.267750 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.370326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.370386 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.370405 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.370434 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.370453 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.450878 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.450923 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.450890 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:49 crc kubenswrapper[4922]: E0929 09:45:49.451030 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.451047 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:49 crc kubenswrapper[4922]: E0929 09:45:49.451095 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:49 crc kubenswrapper[4922]: E0929 09:45:49.451155 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:49 crc kubenswrapper[4922]: E0929 09:45:49.451222 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.473361 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.473422 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.473439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.473526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.473546 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.576578 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.576618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.576628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.576643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.576653 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.678988 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.679037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.679046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.679060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.679069 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.781752 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.781784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.781794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.781808 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.781818 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.884385 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.884753 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.884863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.884950 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.885103 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.987551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.987595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.987605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.987618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:49 crc kubenswrapper[4922]: I0929 09:45:49.987628 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:49Z","lastTransitionTime":"2025-09-29T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.090797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.091137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.091324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.091476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.091616 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:50Z","lastTransitionTime":"2025-09-29T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.195147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.195210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.195262 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.195293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.195308 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:50Z","lastTransitionTime":"2025-09-29T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.298855 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.298892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.298900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.298915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.298924 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:50Z","lastTransitionTime":"2025-09-29T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.402074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.402406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.402597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.402749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.402951 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:50Z","lastTransitionTime":"2025-09-29T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.506880 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.507214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.507327 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.507440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.507534 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:50Z","lastTransitionTime":"2025-09-29T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.610558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.610607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.610628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.610657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.610679 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:50Z","lastTransitionTime":"2025-09-29T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.715036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.715164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.715626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.715666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.715689 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:50Z","lastTransitionTime":"2025-09-29T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.819024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.819102 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.819128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.819161 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.819183 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:50Z","lastTransitionTime":"2025-09-29T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.921856 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.921929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.921950 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.921978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:50 crc kubenswrapper[4922]: I0929 09:45:50.921997 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:50Z","lastTransitionTime":"2025-09-29T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.026459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.026548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.026568 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.026600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.026650 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.131580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.131787 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.132199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.132377 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.132556 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.235810 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.235880 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.235896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.235916 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.235929 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.339259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.339327 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.339342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.339363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.339379 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.443802 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.443888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.443904 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.443928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.443953 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.450985 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.451040 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.451053 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.451060 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:51 crc kubenswrapper[4922]: E0929 09:45:51.451210 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:51 crc kubenswrapper[4922]: E0929 09:45:51.451381 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:51 crc kubenswrapper[4922]: E0929 09:45:51.451541 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:51 crc kubenswrapper[4922]: E0929 09:45:51.451669 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.546661 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.546704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.546715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.546733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.546745 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.650279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.650325 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.650335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.650354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.650366 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.753854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.753917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.753930 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.753957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.753970 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.857763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.858245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.858358 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.858511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.858619 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.962623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.962694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.962709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.962737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:51 crc kubenswrapper[4922]: I0929 09:45:51.962758 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:51Z","lastTransitionTime":"2025-09-29T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.066916 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.067393 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.067564 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.067721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.067916 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.171134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.171185 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.171203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.171228 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.171246 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.274261 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.275036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.275087 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.275115 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.275135 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.377963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.378030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.378043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.378066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.378079 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.480762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.480820 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.480848 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.480872 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.480888 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.583503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.583572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.583582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.583601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.583613 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.687679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.687729 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.687741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.687762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.687773 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.790543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.790600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.790615 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.790635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.790649 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.893101 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.893151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.893166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.893186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.893205 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.996370 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.996405 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.996413 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.996425 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:52 crc kubenswrapper[4922]: I0929 09:45:52.996434 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:52Z","lastTransitionTime":"2025-09-29T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.099775 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.099876 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.099901 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.099932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.099954 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.203208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.203274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.203291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.203316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.203335 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.306634 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.306701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.306717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.306739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.306754 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.409641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.409710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.409727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.409756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.409777 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.450952 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.451016 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.451194 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.451248 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.451030 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.451579 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.451738 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.451940 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.513261 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.513309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.513326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.513349 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.513369 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.600157 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.600226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.600252 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.600282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.600304 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.624796 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:53Z is after 2025-08-24T17:21:41Z"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.630580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.630641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.630653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.630674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.630688 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.646600 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:53Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.651600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.651644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.651653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.651675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.651721 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.668626 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:53Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.673980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.674037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.674057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.674086 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.674108 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.698247 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:53Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.702766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.702822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.702866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.702893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.702911 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.722364 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:53Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:53 crc kubenswrapper[4922]: E0929 09:45:53.722681 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.724925 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.724986 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.725003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.725029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.725046 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.828797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.828951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.828970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.828993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.829011 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.932316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.932443 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.932460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.932482 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:53 crc kubenswrapper[4922]: I0929 09:45:53.932498 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:53Z","lastTransitionTime":"2025-09-29T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.035576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.036163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.036197 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.036228 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.036246 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.140338 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.140412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.140433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.140462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.140481 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.243472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.243538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.243557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.243585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.243602 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.351147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.351203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.351220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.351245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.351267 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.454264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.454331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.454347 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.454369 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.454390 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.557570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.557624 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.557635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.557658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.557670 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.660541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.660616 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.660642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.660672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.660693 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.764921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.764996 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.765022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.765053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.765074 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.869244 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.869331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.869356 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.869390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.869412 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.972872 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.972944 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.972963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.972990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:54 crc kubenswrapper[4922]: I0929 09:45:54.973009 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:54Z","lastTransitionTime":"2025-09-29T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.076428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.076488 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.076508 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.076535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.076552 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:55Z","lastTransitionTime":"2025-09-29T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.178977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.179046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.179064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.179091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.179108 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:55Z","lastTransitionTime":"2025-09-29T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.282938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.283028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.283051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.283083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.283101 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:55Z","lastTransitionTime":"2025-09-29T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.391785 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.391886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.391905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.391928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.391946 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:55Z","lastTransitionTime":"2025-09-29T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.451472 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.451508 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:55 crc kubenswrapper[4922]: E0929 09:45:55.451670 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.451743 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:55 crc kubenswrapper[4922]: E0929 09:45:55.451935 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:55 crc kubenswrapper[4922]: E0929 09:45:55.452048 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.452147 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:55 crc kubenswrapper[4922]: E0929 09:45:55.452226 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.454127 4922 scope.go:117] "RemoveContainer" containerID="9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.482439 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manif
ests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e
9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"stat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.494131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.494180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.494195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.494216 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.494231 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:55Z","lastTransitionTime":"2025-09-29T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.500436 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.521739 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"2025-09-29T09:44:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8\\\\n2025-09-29T09:44:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8 to /host/opt/cni/bin/\\\\n2025-09-29T09:44:59Z [verbose] multus-daemon started\\\\n2025-09-29T09:44:59Z [verbose] 
Readiness Indicator file check\\\\n2025-09-29T09:45:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.538786 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.573227 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.591533 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.597866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.597902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.597915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.597934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.597946 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:55Z","lastTransitionTime":"2025-09-29T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.610538 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a
8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.630338 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.646142 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.663912 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.678477 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.694415 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.703476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.703512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.703523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.703538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.703549 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:55Z","lastTransitionTime":"2025-09-29T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.709812 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc 
kubenswrapper[4922]: I0929 09:45:55.722103 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.744128 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca53
5786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.754544 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.767153 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.777713 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c264
0c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.805536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.805568 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.805577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.805591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.805599 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:55Z","lastTransitionTime":"2025-09-29T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.909387 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.909445 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.909460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.909480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.909491 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:55Z","lastTransitionTime":"2025-09-29T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.964018 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/2.log" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.968859 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05"} Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.969308 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:45:55 crc kubenswrapper[4922]: I0929 09:45:55.995813 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:55Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.012432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.012482 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.012499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc 
kubenswrapper[4922]: I0929 09:45:56.012519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.012535 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.013362 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"2025-09-29T09:44:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8\\\\n2025-09-29T09:44:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8 to /host/opt/cni/bin/\\\\n2025-09-29T09:44:59Z [verbose] multus-daemon started\\\\n2025-09-29T09:44:59Z [verbose] Readiness Indicator file check\\\\n2025-09-29T09:45:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.030479 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.053296 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-
lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc3
2fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.075357 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.087189 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.099457 4922 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.110125 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.114600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.114633 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.114641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.114653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.114664 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.130332 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.144584 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.153637 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.164701 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc 
kubenswrapper[4922]: I0929 09:45:56.201719 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.215926 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca53
5786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.216954 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.216987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.217002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.217019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.217030 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.232798 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.242497 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.260183 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c2640c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dc
cf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.273275 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.320461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.320529 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.320546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.320571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.320588 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.424009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.424077 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.424094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.424118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.424139 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.527541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.527607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.527626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.527652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.527671 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.631272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.631352 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.631373 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.631397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.631415 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.733998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.734065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.734083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.734109 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.734129 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.837453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.837515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.837533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.837562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.837584 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.941134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.941182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.941198 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.941220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.941237 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:56Z","lastTransitionTime":"2025-09-29T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.975273 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/3.log" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.976290 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/2.log" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.980684 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05" exitCode=1 Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.980748 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05"} Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.980869 4922 scope.go:117] "RemoveContainer" containerID="9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c" Sep 29 09:45:56 crc kubenswrapper[4922]: I0929 09:45:56.984451 4922 scope.go:117] "RemoveContainer" containerID="6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05" Sep 29 09:45:56 crc kubenswrapper[4922]: E0929 09:45:56.984743 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.015602 4922 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39b
dae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f72
5a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.037259 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.043628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.043900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.044130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc 
kubenswrapper[4922]: I0929 09:45:57.044343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.044703 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.058466 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"2025-09-29T09:44:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8\\\\n2025-09-29T09:44:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8 to /host/opt/cni/bin/\\\\n2025-09-29T09:44:59Z [verbose] multus-daemon started\\\\n2025-09-29T09:44:59Z [verbose] Readiness Indicator file check\\\\n2025-09-29T09:45:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.076808 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.101512 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5074ade38e81bd6ebe0c89d84a21816ad058c5c620e6358b0718f76491578c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:25Z\\\",\\\"message\\\":\\\"ng admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:25Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:25.433604 6567 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0929 09:45:25.433591 6567 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:56Z\\\",\\\"message\\\":\\\"ould not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:56.445378 6941 services_controller.go:434] Service openshift-apiserver/check-endpoints retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{check-endpoints openshift-apiserver 5f356be5-5c32-4923-9be9-f4ede1a71efd 6150 0 2025-02-23 05:23:46 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver-check-endpoints] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc000926c67 \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7c
fbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.118945 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.142018 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.152593 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.152670 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.152695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.152730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.152751 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.158814 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.175935 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.197266 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.218530 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.233180 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.250431 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc 
kubenswrapper[4922]: I0929 09:45:57.255549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.255603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.255620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.255645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.255664 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.269145 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc
58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.289011 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca53
5786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.308303 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.324128 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.342651 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c264
0c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:57Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.359299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.359347 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.359367 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.359390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.359408 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.451177 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.451223 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:57 crc kubenswrapper[4922]: E0929 09:45:57.451431 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.451713 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:57 crc kubenswrapper[4922]: E0929 09:45:57.451815 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.452145 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:57 crc kubenswrapper[4922]: E0929 09:45:57.452225 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:57 crc kubenswrapper[4922]: E0929 09:45:57.452392 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.463229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.463294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.463312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.463355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.463372 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.568603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.568658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.568675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.568699 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.568718 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.672209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.672276 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.672297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.672329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.672351 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.775702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.775758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.775779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.775811 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.775865 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.879172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.879210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.879221 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.879238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.879250 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.982363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.982985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.983041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.983074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.983097 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:57Z","lastTransitionTime":"2025-09-29T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.987792 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/3.log" Sep 29 09:45:57 crc kubenswrapper[4922]: I0929 09:45:57.993524 4922 scope.go:117] "RemoveContainer" containerID="6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05" Sep 29 09:45:57 crc kubenswrapper[4922]: E0929 09:45:57.994022 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.012503 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.035529 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.050115 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.075862 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.086326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.086386 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.086400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.086419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.086429 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:58Z","lastTransitionTime":"2025-09-29T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.087106 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.100930 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c2640c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd
95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.112014 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\
\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.127464 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca53
5786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.142680 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"2025-09-29T09:44:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8\\\\n2025-09-29T09:44:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8 to /host/opt/cni/bin/\\\\n2025-09-29T09:44:59Z [verbose] multus-daemon started\\\\n2025-09-29T09:44:59Z [verbose] 
Readiness Indicator file check\\\\n2025-09-29T09:45:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.157859 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.179361 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:56Z\\\",\\\"message\\\":\\\"ould not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:56.445378 6941 services_controller.go:434] Service openshift-apiserver/check-endpoints retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{check-endpoints openshift-apiserver 5f356be5-5c32-4923-9be9-f4ede1a71efd 6150 0 2025-02-23 05:23:46 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver-check-endpoints] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc000926c67 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.188605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.188652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.188664 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.188684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.188697 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:58Z","lastTransitionTime":"2025-09-29T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.203582 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.216461 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.229276 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.245400 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.259020 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.273818 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.285679 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:58Z is after 2025-08-24T17:21:41Z" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.291427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.291458 4922 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.291468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.291496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.291507 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:58Z","lastTransitionTime":"2025-09-29T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.395064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.395505 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.395681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.395889 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.396057 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:58Z","lastTransitionTime":"2025-09-29T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.499438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.499505 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.499528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.499556 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.499578 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:58Z","lastTransitionTime":"2025-09-29T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.603029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.603103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.603121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.603503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.603710 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:58Z","lastTransitionTime":"2025-09-29T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.707270 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.707321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.707333 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.707353 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.707368 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:58Z","lastTransitionTime":"2025-09-29T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.810886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.810939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.810956 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.810982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.811001 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:58Z","lastTransitionTime":"2025-09-29T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.914610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.914677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.914710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.914758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:58 crc kubenswrapper[4922]: I0929 09:45:58.914866 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:58Z","lastTransitionTime":"2025-09-29T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.018167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.018211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.018222 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.018238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.018249 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.121257 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.121320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.121338 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.121361 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.121376 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.224348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.224430 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.224454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.224484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.224505 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.328071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.328154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.328175 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.328205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.328230 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.383599 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.383777 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.383811 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.383872 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:03.383812689 +0000 UTC m=+148.750043013 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.383943 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384004 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:47:03.383984605 +0000 UTC m=+148.750214879 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.383997 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.384053 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384078 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384130 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 09:47:03.384117578 +0000 UTC m=+148.750347952 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384141 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384168 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384181 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384231 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 09:47:03.384213691 +0000 UTC m=+148.750443965 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384498 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384543 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384557 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.384632 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 09:47:03.384612211 +0000 UTC m=+148.750842555 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.431511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.431572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.431587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.431610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.431626 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.451261 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.451353 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.451443 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.451445 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.451582 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.451645 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.451729 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:45:59 crc kubenswrapper[4922]: E0929 09:45:59.451805 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.535365 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.535411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.535422 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.535436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.535445 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.638073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.638139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.638156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.638181 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.638198 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.741577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.741623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.741633 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.741648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.741658 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.845215 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.845294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.845313 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.845338 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.845357 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.948796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.948895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.948921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.948952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:45:59 crc kubenswrapper[4922]: I0929 09:45:59.948976 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:45:59Z","lastTransitionTime":"2025-09-29T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.052051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.052104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.052116 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.052136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.052150 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.155211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.155340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.155357 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.155646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.155664 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.258977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.259042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.259062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.259087 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.259104 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.361661 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.361704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.361715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.361732 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.361744 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.464666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.464739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.464766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.464797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.464819 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.567932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.568006 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.568030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.568059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.568081 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.672326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.672399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.672445 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.672476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.672502 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.776411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.776474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.776486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.776507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.776521 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.879722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.879804 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.879814 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.879860 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.879878 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.983279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.983319 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.983329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.983347 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:00 crc kubenswrapper[4922]: I0929 09:46:00.983359 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:00Z","lastTransitionTime":"2025-09-29T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.088654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.089089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.089179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.089251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.089322 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:01Z","lastTransitionTime":"2025-09-29T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.192932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.192997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.193014 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.193037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.193055 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:01Z","lastTransitionTime":"2025-09-29T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.296497 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.296555 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.296572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.296596 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.296617 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:01Z","lastTransitionTime":"2025-09-29T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.399097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.399127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.399135 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.399148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.399157 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:01Z","lastTransitionTime":"2025-09-29T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.451290 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.451392 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:01 crc kubenswrapper[4922]: E0929 09:46:01.451484 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.451570 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.451580 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:01 crc kubenswrapper[4922]: E0929 09:46:01.451717 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:01 crc kubenswrapper[4922]: E0929 09:46:01.451823 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:01 crc kubenswrapper[4922]: E0929 09:46:01.451960 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.502099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.502158 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.502179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.502205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.502223 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:01Z","lastTransitionTime":"2025-09-29T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.605420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.605483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.605501 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.605527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.605547 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:01Z","lastTransitionTime":"2025-09-29T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.709124 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.709178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.709195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.709223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.709244 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:01Z","lastTransitionTime":"2025-09-29T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.812000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.812072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.812090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.812115 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.812178 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:01Z","lastTransitionTime":"2025-09-29T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.915486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.915538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.915549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.915590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:01 crc kubenswrapper[4922]: I0929 09:46:01.915610 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:01Z","lastTransitionTime":"2025-09-29T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.018306 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.018363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.018381 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.018409 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.018426 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.122017 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.122072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.122088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.122111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.122128 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.227806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.227918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.227939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.227964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.227981 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.331233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.331299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.331322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.331354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.331373 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.434195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.434256 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.434274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.434301 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.434325 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.537857 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.538149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.538240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.538331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.538414 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.641866 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.641940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.641962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.641992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.642013 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.745208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.745249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.745261 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.745277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.745290 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.848683 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.849028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.849141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.849238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.849364 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.952972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.953340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.953549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.953767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:02 crc kubenswrapper[4922]: I0929 09:46:02.954025 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:02Z","lastTransitionTime":"2025-09-29T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.057976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.058041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.058062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.058089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.058107 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.161377 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.161421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.161433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.161449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.161461 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.265619 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.265682 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.265699 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.265722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.265739 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.369756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.369806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.369855 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.369882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.369903 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.451187 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.451224 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:03 crc kubenswrapper[4922]: E0929 09:46:03.451396 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.451446 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.451553 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:03 crc kubenswrapper[4922]: E0929 09:46:03.451609 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:03 crc kubenswrapper[4922]: E0929 09:46:03.451871 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:03 crc kubenswrapper[4922]: E0929 09:46:03.452011 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.473320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.473600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.473742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.473926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.474079 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.578324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.578378 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.578398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.578423 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.578444 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.682289 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.682342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.682351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.682369 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.682380 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.785739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.785807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.785824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.785879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.785898 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.889440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.889548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.889565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.889590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.889608 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.992530 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.992590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.992613 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.992642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:03 crc kubenswrapper[4922]: I0929 09:46:03.992703 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:03Z","lastTransitionTime":"2025-09-29T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.049051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.049111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.049128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.049153 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.049171 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:04Z","lastTransitionTime":"2025-09-29T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:04 crc kubenswrapper[4922]: E0929 09:46:04.072165 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.078106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.078151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.078169 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.078194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.078212 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:04Z","lastTransitionTime":"2025-09-29T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:04 crc kubenswrapper[4922]: E0929 09:46:04.100066 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.106068 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.106129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.106149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.106182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.106201 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:04Z","lastTransitionTime":"2025-09-29T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:04 crc kubenswrapper[4922]: E0929 09:46:04.129404 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.136600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.136715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.136737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.136765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.136784 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:04Z","lastTransitionTime":"2025-09-29T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:04 crc kubenswrapper[4922]: E0929 09:46:04.159653 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.166871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.166941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.166956 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.166980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.166996 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:04Z","lastTransitionTime":"2025-09-29T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:04 crc kubenswrapper[4922]: E0929 09:46:04.188585 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:04Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:04 crc kubenswrapper[4922]: E0929 09:46:04.188883 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.191656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.191719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.191737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.191761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.191776 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:04Z","lastTransitionTime":"2025-09-29T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.295365 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.295427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.295441 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.295471 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:04 crc kubenswrapper[4922]: I0929 09:46:04.295491 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:04Z","lastTransitionTime":"2025-09-29T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.451000 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.451053 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.451054 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:05 crc kubenswrapper[4922]: E0929 09:46:05.451181 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.451212 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:05 crc kubenswrapper[4922]: E0929 09:46:05.451388 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:05 crc kubenswrapper[4922]: E0929 09:46:05.451532 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:05 crc kubenswrapper[4922]: E0929 09:46:05.451644 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.469120 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.486124 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.502501 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc 
kubenswrapper[4922]: I0929 09:46:05.522282 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c2640c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.536766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.536801 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.536814 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.536860 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.536876 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:05Z","lastTransitionTime":"2025-09-29T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.545631 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.569930 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca53
5786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.591606 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.607615 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.640959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.641119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.641145 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.641170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.641189 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:05Z","lastTransitionTime":"2025-09-29T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.641881 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:56Z\\\",\\\"message\\\":\\\"ould not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:56.445378 6941 services_controller.go:434] Service openshift-apiserver/check-endpoints retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{check-endpoints openshift-apiserver 5f356be5-5c32-4923-9be9-f4ede1a71efd 6150 0 2025-02-23 05:23:46 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver-check-endpoints] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc000926c67 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.674484 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.694316 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.715354 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"2025-09-29T09:44:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8\\\\n2025-09-29T09:44:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8 to /host/opt/cni/bin/\\\\n2025-09-29T09:44:59Z [verbose] multus-daemon started\\\\n2025-09-29T09:44:59Z [verbose] 
Readiness Indicator file check\\\\n2025-09-29T09:45:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.732393 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.744983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.745026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.745037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:05 crc 
kubenswrapper[4922]: I0929 09:46:05.745059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.745073 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:05Z","lastTransitionTime":"2025-09-29T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.752714 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.778561 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c668
9759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.798320 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.813699 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc44e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.830766 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:05Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.848315 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 
09:46:05.848352 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.848369 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.848393 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.848411 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:05Z","lastTransitionTime":"2025-09-29T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.951151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.951218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.951241 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.951270 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:05 crc kubenswrapper[4922]: I0929 09:46:05.951293 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:05Z","lastTransitionTime":"2025-09-29T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.053913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.053972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.053989 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.054015 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.054033 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:06Z","lastTransitionTime":"2025-09-29T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.157610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.157654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.157666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.157684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.157698 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:06Z","lastTransitionTime":"2025-09-29T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.260226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.260598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.260684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.260763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.260867 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:06Z","lastTransitionTime":"2025-09-29T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.364274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.364336 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.364353 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.364378 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.364396 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:06Z","lastTransitionTime":"2025-09-29T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.466788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.466855 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.466868 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.466887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.466900 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:06Z","lastTransitionTime":"2025-09-29T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.570142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.570329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.570409 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.570448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.570529 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:06Z","lastTransitionTime":"2025-09-29T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.698557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.699054 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.699277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.699450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.699606 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:06Z","lastTransitionTime":"2025-09-29T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.803411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.803456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.803468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.803486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.803498 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:06Z","lastTransitionTime":"2025-09-29T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.906395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.907043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.907084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.907110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:06 crc kubenswrapper[4922]: I0929 09:46:06.907124 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:06Z","lastTransitionTime":"2025-09-29T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.010044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.010102 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.010119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.010143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.010162 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.113898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.113980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.113998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.114026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.114048 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.217465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.217513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.217526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.217546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.217557 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.320953 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.320993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.321002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.321019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.321030 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.424433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.425141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.425175 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.425210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.425234 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.451428 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.451463 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.451628 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:07 crc kubenswrapper[4922]: E0929 09:46:07.451746 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.451851 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:07 crc kubenswrapper[4922]: E0929 09:46:07.452076 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:07 crc kubenswrapper[4922]: E0929 09:46:07.452277 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:07 crc kubenswrapper[4922]: E0929 09:46:07.452396 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.467888 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.528295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.528338 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.528351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.528368 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.528380 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.632076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.632138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.632155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.632177 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.632193 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.735678 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.735741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.735761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.735788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.735805 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.840554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.840608 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.840625 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.840652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.840675 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.944189 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.944250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.944268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.944297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:07 crc kubenswrapper[4922]: I0929 09:46:07.944321 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:07Z","lastTransitionTime":"2025-09-29T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.047347 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.047413 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.047432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.047454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.047470 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.150728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.150799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.150819 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.150990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.151018 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.254197 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.254276 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.254298 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.254325 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.254349 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.357636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.357709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.357726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.357762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.357783 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.461499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.461567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.461590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.461626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.461648 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.565980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.566059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.566080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.566112 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.566132 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.670168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.670278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.670297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.670330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.670352 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.774205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.774258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.774268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.774288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.774299 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.878109 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.878184 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.878203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.878230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.878246 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.981164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.981237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.981262 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.981293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:08 crc kubenswrapper[4922]: I0929 09:46:08.981317 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:08Z","lastTransitionTime":"2025-09-29T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.084668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.084756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.084777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.084806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.084858 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:09Z","lastTransitionTime":"2025-09-29T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.188342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.188409 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.188428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.188452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.188469 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:09Z","lastTransitionTime":"2025-09-29T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.291318 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.291385 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.291403 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.291427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.291444 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:09Z","lastTransitionTime":"2025-09-29T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.394960 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.395042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.395062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.395092 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.395116 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:09Z","lastTransitionTime":"2025-09-29T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.451051 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.451105 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.451610 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.451671 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:09 crc kubenswrapper[4922]: E0929 09:46:09.451906 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:09 crc kubenswrapper[4922]: E0929 09:46:09.452007 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:09 crc kubenswrapper[4922]: E0929 09:46:09.452445 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.452456 4922 scope.go:117] "RemoveContainer" containerID="6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05" Sep 29 09:46:09 crc kubenswrapper[4922]: E0929 09:46:09.452582 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:09 crc kubenswrapper[4922]: E0929 09:46:09.452879 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.497932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.498000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.498018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.498041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.498059 4922 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:09Z","lastTransitionTime":"2025-09-29T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.601473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.601535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.601561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.601592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.601615 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:09Z","lastTransitionTime":"2025-09-29T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.704770 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.704826 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.704875 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.704901 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.704923 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:09Z","lastTransitionTime":"2025-09-29T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.808564 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.808645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.808671 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.808707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.808726 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:09Z","lastTransitionTime":"2025-09-29T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.912370 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.912435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.912453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.912479 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:09 crc kubenswrapper[4922]: I0929 09:46:09.912496 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:09Z","lastTransitionTime":"2025-09-29T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.016581 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.016668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.016694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.016742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.016769 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.121319 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.121426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.121450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.121949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.122275 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.225653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.225705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.225722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.225745 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.225763 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.329012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.329082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.329102 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.329126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.329144 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.432212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.432281 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.432304 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.432334 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.432356 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.536702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.536775 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.536797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.536868 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.536897 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.639631 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.639685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.639706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.639730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.639747 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.742038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.742093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.742112 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.742136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.742152 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.844742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.844790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.844806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.844854 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.844872 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.948020 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.948059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.948070 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.948086 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:10 crc kubenswrapper[4922]: I0929 09:46:10.948098 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:10Z","lastTransitionTime":"2025-09-29T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.050542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.050586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.050604 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.050628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.050646 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.153767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.153857 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.153878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.153906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.153927 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.256495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.256532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.256542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.256561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.256571 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.359743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.359797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.359815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.359867 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.359885 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.450889 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:11 crc kubenswrapper[4922]: E0929 09:46:11.451055 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.451297 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:11 crc kubenswrapper[4922]: E0929 09:46:11.451400 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.452590 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:11 crc kubenswrapper[4922]: E0929 09:46:11.452712 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.452946 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:11 crc kubenswrapper[4922]: E0929 09:46:11.453109 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.462205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.462245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.462263 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.462285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.462302 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.565044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.565108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.565126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.565220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.565297 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.669584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.669629 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.669641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.669661 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.669672 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.772395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.772456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.772466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.772485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.772497 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.876072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.876164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.876193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.876226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.876251 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.979703 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.979795 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.979822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.979916 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:11 crc kubenswrapper[4922]: I0929 09:46:11.979943 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:11Z","lastTransitionTime":"2025-09-29T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.083353 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.083397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.083407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.083449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.083463 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:12Z","lastTransitionTime":"2025-09-29T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.186331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.186386 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.186402 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.186422 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.186436 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:12Z","lastTransitionTime":"2025-09-29T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.289956 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.290022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.290032 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.290048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.290060 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:12Z","lastTransitionTime":"2025-09-29T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.394393 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.394539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.394558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.394582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.394603 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:12Z","lastTransitionTime":"2025-09-29T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.498417 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.498476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.498548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.498627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.498648 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:12Z","lastTransitionTime":"2025-09-29T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.602131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.602178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.602190 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.602206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.602217 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:12Z","lastTransitionTime":"2025-09-29T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.705474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.705536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.705554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.705579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.705598 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:12Z","lastTransitionTime":"2025-09-29T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.809201 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.809268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.809291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.809324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.809347 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:12Z","lastTransitionTime":"2025-09-29T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.911761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.911806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.911824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.911871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:12 crc kubenswrapper[4922]: I0929 09:46:12.911889 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:12Z","lastTransitionTime":"2025-09-29T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.015357 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.015573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.015597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.015622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.015638 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.118976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.119041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.119057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.119082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.119099 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.222365 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.222420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.222436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.222460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.222479 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.325085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.325160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.325179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.325203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.325223 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.427590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.427662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.427685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.427715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.427736 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.451688 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:13 crc kubenswrapper[4922]: E0929 09:46:13.451912 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.452127 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.452188 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:13 crc kubenswrapper[4922]: E0929 09:46:13.452299 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.452368 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:13 crc kubenswrapper[4922]: E0929 09:46:13.452603 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:13 crc kubenswrapper[4922]: E0929 09:46:13.452790 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.529697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.529740 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.529749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.529768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.529778 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.632533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.632597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.632616 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.632641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.632657 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.735450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.735514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.735532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.735554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.735571 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.838507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.838565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.838582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.838611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.838631 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.941492 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.941536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.941547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.941564 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:13 crc kubenswrapper[4922]: I0929 09:46:13.941577 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:13Z","lastTransitionTime":"2025-09-29T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.045195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.045318 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.045342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.045423 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.045442 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.148106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.148167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.148187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.148232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.148246 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.251807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.251912 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.251927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.251946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.251958 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.355522 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.355606 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.355630 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.355659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.355678 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.458867 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.458935 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.458952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.458977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.458994 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.476480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.476585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.476607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.476640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.476660 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: E0929 09:46:14.498211 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.503608 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.503661 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.503676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.503699 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.503714 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: E0929 09:46:14.524178 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[... identical to the image list in the first patch attempt above ...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.529921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.530012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.530034 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.530066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.530088 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: E0929 09:46:14.553084 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[... identical to the image list in the first patch attempt above ...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.557926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.558009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.558064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.558089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.558107 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: E0929 09:46:14.578611 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.583085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.583136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.583155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.583179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.583198 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: E0929 09:46:14.598487 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:14Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:14 crc kubenswrapper[4922]: E0929 09:46:14.598642 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.600911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.601003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.601021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.601046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.601063 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.703711 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.703767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.703784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.703808 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.703824 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.807439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.807509 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.807541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.807572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.807597 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.909862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.909926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.909943 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.909973 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:14 crc kubenswrapper[4922]: I0929 09:46:14.909997 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:14Z","lastTransitionTime":"2025-09-29T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.013175 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.013243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.013263 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.013292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.013309 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.117373 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.117420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.117431 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.117451 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.117463 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.220739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.220808 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.220855 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.220884 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.220902 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.324858 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.324941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.324965 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.324996 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.325022 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.428367 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.428413 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.428422 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.428438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.428448 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.451284 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.451372 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.451322 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:15 crc kubenswrapper[4922]: E0929 09:46:15.451536 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.451570 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:15 crc kubenswrapper[4922]: E0929 09:46:15.451796 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:15 crc kubenswrapper[4922]: E0929 09:46:15.452061 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:15 crc kubenswrapper[4922]: E0929 09:46:15.452171 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.474418 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a283de6209601eed9ae42a9d3f5f0c9b71c2ae4b1aa789003b2855ee635c9e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.500776 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-66xg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c029143-44a6-410b-8496-24f92c58bb8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b30a42151ab2078e5c865f18c77f4eee62268a129b21b1599a4a93bd5b8017cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72c568dd8a2c00611823c421d86a7cfd8edb4d7ea7dd48155e0dd3f19940935\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b51d417ea193078d7b701c914698e6b9cfea3d5b575bfa3199617dfb1ca092e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:59Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6ad4f0f507145a9fcfa327cdb6b7c0e6b0af1828b848dd5bbd595fa956c8b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c668
9759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c6689759d03d265243c2c6ebfd4251cad5eb8006fb445b743c531d355524dea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30afca3591ac4b0d0e6383ddd0ca127c9cf3d1dc819c71c11cdc9a78dea4169f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1567cc67580183f65344b215a920d3a2f302befc9e3b953139925c379ac43edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2gcb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-66xg9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.523880 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.531355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.531407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.531427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.531456 
4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.531476 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.542797 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57ce43a3-0e3a-4abb-a79c-f202bc3d44c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4360e9c68d925f807b7768fd6fc1eccc5b7ee3ab8664cb3783ad84d9b01e55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dab15c98b18ceef390b3befae924ef660b642c4443a0a33053c4749ea39f778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dab15c98b18ceef390b3befae924ef660b642c4443a0a33053c4749ea39f778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.564045 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbd1c2972a531c24c26743673d37e87f61249cc672ec89b19d32a7776984b2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42c1887a8b4b22be4ae7fae427315344ce31b10c7020aa414b1bdecc4
4e26bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.579663 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:15 crc kubenswrapper[4922]: E0929 09:46:15.580006 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:46:15 crc kubenswrapper[4922]: E0929 
09:46:15.580225 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs podName:48a99f27-a7b4-466d-b130-026774744f7d nodeName:}" failed. No retries permitted until 2025-09-29 09:47:19.580161806 +0000 UTC m=+164.946392240 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs") pod "network-metrics-daemon-9p9s8" (UID: "48a99f27-a7b4-466d-b130-026774744f7d") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.581187 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.598999 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.615070 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h9kvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0358d9bd-7f9c-49c4-9690-ee1fee839c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f85e25747eb09849ed3a1458844632422f1fd5076a880490bd4a3f36d7db9eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cghjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h9kvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.630378 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a99f27-a7b4-466d-b130-026774744f7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jsvm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9p9s8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc 
kubenswrapper[4922]: I0929 09:46:15.634296 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.634355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.634372 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.634398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.634415 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.641985 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f01653-0511-4f73-ade6-c1d7f351e3e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce0c2640c9913232152cb8ebeaa8c4ade68214517534796b2d4e0b84bd95cab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e85e4a51e270ed1958f5a6a1c9ed0f8e18dccf1256057489a1d092bde86c4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh7wj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:45:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-b7k8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.655452 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c6a93f-9138-4722-928c-a844ccafbd14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c039e31de217f49d4de39241f7f8a0a070e76f16c0d41c4bf2f89e153bff8846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66050092f4859f828461450e0534dc58d358e553f8d2a8f4523b76ed009fc7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c793c6956a1e6996d22fe571a62e54190beee5a691be86da917eb81fb37f512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19858ecffc16ba61b310163fc901aa95c6821f3b7c5c1d94b7fb16bc96941c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.669081 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207ccf3e-83c1-4a41-a565-3b8cad0b8bb7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaee02dfd308286c1946806725d683d12fb4f53c92b9640477f2d986fda5157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b50cf90502a724e59d21822db14dbba8e467f9a91b26cdbfa026e715ffd56ba3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a63bd48948f78192b37766434a9ff79519bbfa9a1f3484623dec4b264bd265\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b9b71641c9ae83caa4fd7d52af9519abac3b5c0597c041f0addb4cda742e19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e490c82afd4539dec2a07cae7c704a28f4f75ef683c6474a55028168e318c845\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T09:44:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 09:44:49.139204 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 09:44:49.140564 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-159949335/tls.crt::/tmp/serving-cert-159949335/tls.key\\\\\\\"\\\\nI0929 09:44:55.515404 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 09:44:55.521549 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 09:44:55.521603 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 09:44:55.521685 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 09:44:55.521698 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 09:44:55.535382 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 09:44:55.535406 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535417 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 09:44:55.535422 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0929 09:44:55.535425 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 09:44:55.535429 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 09:44:55.535431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0929 09:44:55.535594 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0929 09:44:55.542817 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e9fd58dc79e480f098dd6733db8cb835d13db38c130ab617308e4e6e2b1562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca535786085349f8a99da9ec0bee29cd9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8e2e7ace2aed8547c436e0732baca53
5786085349f8a99da9ec0bee29cd9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.682053 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfd7083baf1f0fcfdc8548fc54c913722e5b14e2a4de303ca7e0c7ff097cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.692772 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dbcdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a512521c-5cca-4e12-8e5f-97ba1b42b325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb4ce9af97aab27825f2c5671decd9c1b4bcf44a7501d06a1920f2918a71055f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-864fc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dbcdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.717488 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee08d9f2-f100-4598-8ab3-5198a21b08f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:58Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:56Z\\\",\\\"message\\\":\\\"ould not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: 
Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0929 09:45:56.445378 6941 services_controller.go:434] Service openshift-apiserver/check-endpoints retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{check-endpoints openshift-apiserver 5f356be5-5c32-4923-9be9-f4ede1a71efd 6150 0 2025-02-23 05:23:46 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver-check-endpoints] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc000926c67 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:45:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22965724b3b41f11c7
cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tr9bt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.738899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.738983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.738792 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"638f9737-0321-4c50-a33d-8759c128596e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239b8e9df7169405a982daa8ad5a1012348430c1d6e0de5f8ff77efef5d8e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a77f2d35607d650a51a985cf012fab5bad54a618735c61702b39bdae188ae7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://546dd91c0eaadc1b4efae004e2409dcc5f8562094878fc914d840e01454205e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77c4608e58f3e9d251fc3cefab4b5dbac08005f84b5ca58bf892b2f4f907abd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1751f66da428104e7a767f22d58d7263bc7cd46457a68d5b527a984f6464e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2fafd5df648f55a33db8b9d2f725a3d3b25aa8fca1fac89607d87390f62f91\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78a44d128933a03309d4b79f9ea7a47cc88bdae75c6a8ab05c3895373dd86adb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61598b860753b4cb7f31701efb8d7f959b66a2179d038db30c28871b29c08616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.739004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.739266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.739290 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.756713 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.771105 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h6dfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dc69012-4e4c-437b-82d8-9d04e2e22e58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T09:45:44Z\\\",\\\"message\\\":\\\"2025-09-29T09:44:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8\\\\n2025-09-29T09:44:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_997ee5fa-3d5b-4c8c-8eaf-671f64d97ea8 to /host/opt/cni/bin/\\\\n2025-09-29T09:44:59Z [verbose] multus-daemon started\\\\n2025-09-29T09:44:59Z [verbose] 
Readiness Indicator file check\\\\n2025-09-29T09:45:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hstnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h6dfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.786421 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18583652-9871-4fba-93c8-9f86e9f57622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38cf8487a7f6059d2a59cfc7cd484b2ae96b11b1e0495af04c98c9405ffd5a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e34ca5137e1e43806655d40497c30504d0929e61
ecf3fd7cfcfe1d2a57203db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7p2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kgzgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:15Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.841280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.841320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.841332 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc 
kubenswrapper[4922]: I0929 09:46:15.841352 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.841365 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.944670 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.944728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.944744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.944772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:15 crc kubenswrapper[4922]: I0929 09:46:15.944790 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:15Z","lastTransitionTime":"2025-09-29T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.047798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.047919 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.047943 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.047969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.047986 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.150614 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.150669 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.150685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.150705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.150719 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.253440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.253513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.253533 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.253562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.253581 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.356115 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.356180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.356202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.356226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.356301 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.459587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.459651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.459671 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.459696 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.459713 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.563345 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.563405 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.563420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.563446 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.563461 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.666297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.666352 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.666371 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.666397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.666414 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.769545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.769600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.769617 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.769641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.769658 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.872497 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.872569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.872590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.872620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.872642 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.976089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.976483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.976981 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.977294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:16 crc kubenswrapper[4922]: I0929 09:46:16.977730 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:16Z","lastTransitionTime":"2025-09-29T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.080283 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.081074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.081569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.081734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.082045 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:17Z","lastTransitionTime":"2025-09-29T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.186166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.186231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.186250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.186278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.186296 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:17Z","lastTransitionTime":"2025-09-29T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.290043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.290100 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.290118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.290142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.290159 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:17Z","lastTransitionTime":"2025-09-29T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.408715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.408750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.408758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.408771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.408779 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:17Z","lastTransitionTime":"2025-09-29T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.451295 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.451391 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.451392 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.451318 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:17 crc kubenswrapper[4922]: E0929 09:46:17.451490 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:17 crc kubenswrapper[4922]: E0929 09:46:17.451651 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:17 crc kubenswrapper[4922]: E0929 09:46:17.451743 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:17 crc kubenswrapper[4922]: E0929 09:46:17.451896 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.512110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.512515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.512617 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.512717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.512804 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:17Z","lastTransitionTime":"2025-09-29T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.615985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.616047 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.616064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.616092 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.616110 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:17Z","lastTransitionTime":"2025-09-29T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.718904 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.719492 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.719626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.719806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.720048 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:17Z","lastTransitionTime":"2025-09-29T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.823571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.823626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.823643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.823669 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.823691 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:17Z","lastTransitionTime":"2025-09-29T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.926896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.926957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.926971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.926992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:17 crc kubenswrapper[4922]: I0929 09:46:17.927011 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:17Z","lastTransitionTime":"2025-09-29T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.030082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.030150 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.030168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.030193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.030210 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.133528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.133592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.133612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.133635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.133657 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.236969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.237016 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.237032 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.237050 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.237063 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.339584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.339658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.339692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.339722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.339807 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.443566 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.443633 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.443651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.443674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.443690 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.546094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.546152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.546161 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.546179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.546189 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.649169 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.649229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.649248 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.649275 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.649294 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.752388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.752458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.752482 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.752507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.752525 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.855792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.855890 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.855920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.855948 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.855967 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.958923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.959000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.959025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.959055 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:18 crc kubenswrapper[4922]: I0929 09:46:18.959077 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:18Z","lastTransitionTime":"2025-09-29T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.062686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.062751 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.062768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.062794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.062812 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.166371 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.166431 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.166448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.166474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.166492 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.269214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.269264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.269288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.269312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.269329 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.372937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.372998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.373015 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.373038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.373057 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.451144 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.451294 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.451410 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:19 crc kubenswrapper[4922]: E0929 09:46:19.451401 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:19 crc kubenswrapper[4922]: E0929 09:46:19.451542 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.451641 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:19 crc kubenswrapper[4922]: E0929 09:46:19.451767 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:19 crc kubenswrapper[4922]: E0929 09:46:19.451930 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.475493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.475549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.475561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.475579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.475592 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.578896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.578940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.578950 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.578968 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.578979 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.682093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.682186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.682210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.682238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.682257 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.785481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.785518 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.785553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.785567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.785576 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.888515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.888579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.888596 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.888623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.888644 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.992060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.992123 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.992140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.992169 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:19 crc kubenswrapper[4922]: I0929 09:46:19.992187 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:19Z","lastTransitionTime":"2025-09-29T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.095146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.095214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.095226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.095257 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.095272 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:20Z","lastTransitionTime":"2025-09-29T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.198930 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.199023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.199044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.199078 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.199102 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:20Z","lastTransitionTime":"2025-09-29T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.302487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.302553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.302575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.302603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.302622 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:20Z","lastTransitionTime":"2025-09-29T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.406414 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.406480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.406498 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.406523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.406542 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:20Z","lastTransitionTime":"2025-09-29T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.509964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.510046 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.510070 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.510102 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.510122 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:20Z","lastTransitionTime":"2025-09-29T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.613705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.613774 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.613794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.613820 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.613887 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:20Z","lastTransitionTime":"2025-09-29T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.718147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.718217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.718234 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.718259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.718277 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:20Z","lastTransitionTime":"2025-09-29T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.821619 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.821667 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.821683 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.821705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.821721 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:20Z","lastTransitionTime":"2025-09-29T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.924313 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.924388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.924409 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.924438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:20 crc kubenswrapper[4922]: I0929 09:46:20.924460 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:20Z","lastTransitionTime":"2025-09-29T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.027340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.027446 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.027460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.027480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.027492 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.130877 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.130987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.131009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.131083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.131107 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.234686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.234728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.234738 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.234757 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.234768 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.337951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.338016 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.338031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.338055 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.338072 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.441526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.441570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.441580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.441599 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.441609 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.451100 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.451147 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:21 crc kubenswrapper[4922]: E0929 09:46:21.451241 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.451100 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:21 crc kubenswrapper[4922]: E0929 09:46:21.451305 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.451459 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:21 crc kubenswrapper[4922]: E0929 09:46:21.451635 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:21 crc kubenswrapper[4922]: E0929 09:46:21.451821 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.544641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.544728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.544746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.544771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.544786 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.647626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.647681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.647694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.647715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.647728 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.750531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.750584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.750602 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.750623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.750642 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.852980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.853024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.853040 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.853063 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.853080 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.956271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.956348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.956373 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.956402 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:21 crc kubenswrapper[4922]: I0929 09:46:21.956421 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:21Z","lastTransitionTime":"2025-09-29T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.059403 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.059466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.059483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.059510 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.059528 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:22Z","lastTransitionTime":"2025-09-29T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.163439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.163507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.163524 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.163548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.163565 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:22Z","lastTransitionTime":"2025-09-29T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.268407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.268469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.268486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.268511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.268533 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:22Z","lastTransitionTime":"2025-09-29T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.380879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.380937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.380956 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.380981 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.381004 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:22Z","lastTransitionTime":"2025-09-29T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.483739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.484163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.484387 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.484558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.484720 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:22Z","lastTransitionTime":"2025-09-29T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.588344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.588404 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.588421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.588445 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.588463 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:22Z","lastTransitionTime":"2025-09-29T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.691544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.692442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.692722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.692848 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.692980 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:22Z","lastTransitionTime":"2025-09-29T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.795662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.795725 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.795742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.795768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.795786 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:22Z","lastTransitionTime":"2025-09-29T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.899119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.899180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.899199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.899224 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:22 crc kubenswrapper[4922]: I0929 09:46:22.899247 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:22Z","lastTransitionTime":"2025-09-29T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.002662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.002746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.002763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.002786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.002803 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.105578 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.105638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.105706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.105732 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.105751 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.209398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.209467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.209487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.209513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.209533 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.312910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.313005 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.313036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.313067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.313090 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.416594 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.416646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.416669 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.416692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.416711 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.451376 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.451423 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.451429 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.451376 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:23 crc kubenswrapper[4922]: E0929 09:46:23.451523 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:23 crc kubenswrapper[4922]: E0929 09:46:23.451671 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:23 crc kubenswrapper[4922]: E0929 09:46:23.451781 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:23 crc kubenswrapper[4922]: E0929 09:46:23.451860 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.452528 4922 scope.go:117] "RemoveContainer" containerID="6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05" Sep 29 09:46:23 crc kubenswrapper[4922]: E0929 09:46:23.452678 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.519805 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.519893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.519911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.519932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.519946 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.623456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.623519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.623536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.623560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.623580 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.727205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.727324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.727360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.727394 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.727415 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.830707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.830772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.830790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.830814 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.830864 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.933931 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.934028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.934047 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.934071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:23 crc kubenswrapper[4922]: I0929 09:46:23.934123 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:23Z","lastTransitionTime":"2025-09-29T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.036659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.036735 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.036762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.036792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.036815 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.139952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.140019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.140041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.140074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.140097 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.243512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.243571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.243590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.243612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.243631 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.346583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.346664 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.346690 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.346724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.346746 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.449662 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.449826 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.449937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.449966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.449987 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.553819 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.553920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.553938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.553964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.553980 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.657598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.657668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.657686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.657711 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.657733 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.686437 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.686500 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.686519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.686545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.686563 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: E0929 09:46:24.708312 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.714080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.714146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.714164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.714193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.714213 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: E0929 09:46:24.735632 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.740476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.740536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.740553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.740578 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.740596 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: E0929 09:46:24.757075 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.761799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.761871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.761887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.761907 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.761920 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: E0929 09:46:24.782406 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.786737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.786793 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.786811 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.786870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.786907 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: E0929 09:46:24.808358 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T09:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f2e3be9-45e6-42ad-bd2f-b7aae5c969bd\\\",\\\"systemUUID\\\":\\\"33c12f62-5b5f-4d4e-9af7-92ce6ab7df30\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:24Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:24 crc kubenswrapper[4922]: E0929 09:46:24.808534 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.810080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.810130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.810150 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.810176 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.810193 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.913258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.913322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.913339 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.913364 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:24 crc kubenswrapper[4922]: I0929 09:46:24.913382 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:24Z","lastTransitionTime":"2025-09-29T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.017043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.017126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.017151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.017182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.017207 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.119611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.119680 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.119698 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.119725 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.119768 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.222406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.222470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.222494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.222524 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.222549 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.326381 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.326460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.326481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.326509 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.326528 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.430193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.430261 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.430285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.430315 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.430338 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.450923 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.450986 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.450934 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:25 crc kubenswrapper[4922]: E0929 09:46:25.451192 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.451259 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:25 crc kubenswrapper[4922]: E0929 09:46:25.451545 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:25 crc kubenswrapper[4922]: E0929 09:46:25.452032 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:25 crc kubenswrapper[4922]: E0929 09:46:25.452173 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.476361 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b506724-4809-4b8f-ab2c-acd4659dd474\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f3cdab7e08e5e1a324fc494d9b29809ab830752a866780838647471009aef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f10449ad681ea5f4e143518b7c24c92e79e3a35d92567c7f4bb6e79c1a00e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://156951963fa6e3260b950470648b7ad986a8b81dbbf1b2a56012c12013ce1990\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb229b4f18663519e0179587573c850c33657f01ebe7101c11d72e64cb70abe1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.492799 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57ce43a3-0e3a-4abb-a79c-f202bc3d44c7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4360e9c68d925f807b7768fd6fc1eccc5b7ee3ab8664cb3783ad84d9b01e55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T09:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dab15c98b18ceef390b3befae924ef660b642c4443a0a33053c4749ea39f778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7dab15c98b18ceef390b3befae924ef660b642c4443a0a33053c4749ea39f778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T09:44:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T09:44:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T09:44:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T09:46:25Z is after 2025-08-24T17:21:41Z" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.534476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.534529 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.534547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.534572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.534589 4922 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.607504 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-66xg9" podStartSLOduration=90.607478589 podStartE2EDuration="1m30.607478589s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:25.607368606 +0000 UTC m=+110.973598930" watchObservedRunningTime="2025-09-29 09:46:25.607478589 +0000 UTC m=+110.973708883" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.638878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.638932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.638948 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.638968 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.638985 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.649063 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h9kvt" podStartSLOduration=89.649031965 podStartE2EDuration="1m29.649031965s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:25.648881611 +0000 UTC m=+111.015111885" watchObservedRunningTime="2025-09-29 09:46:25.649031965 +0000 UTC m=+111.015262269" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.699747 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=54.699730569 podStartE2EDuration="54.699730569s" podCreationTimestamp="2025-09-29 09:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:25.678793066 +0000 UTC m=+111.045023370" watchObservedRunningTime="2025-09-29 09:46:25.699730569 +0000 UTC m=+111.065960833" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.717578 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.717539328 podStartE2EDuration="1m29.717539328s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:25.700695205 +0000 UTC m=+111.066925489" watchObservedRunningTime="2025-09-29 09:46:25.717539328 +0000 UTC m=+111.083769602" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.729471 4922 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dbcdn" podStartSLOduration=90.729442008 podStartE2EDuration="1m30.729442008s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:25.728637426 +0000 UTC m=+111.094867700" watchObservedRunningTime="2025-09-29 09:46:25.729442008 +0000 UTC m=+111.095672282" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.742070 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.742155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.742176 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.742205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.742225 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.749782 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-b7k8h" podStartSLOduration=89.749754334 podStartE2EDuration="1m29.749754334s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:25.749501847 +0000 UTC m=+111.115732161" watchObservedRunningTime="2025-09-29 09:46:25.749754334 +0000 UTC m=+111.115984608" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.791663 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=92.79163579 podStartE2EDuration="1m32.79163579s" podCreationTimestamp="2025-09-29 09:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:25.790021587 +0000 UTC m=+111.156251861" watchObservedRunningTime="2025-09-29 09:46:25.79163579 +0000 UTC m=+111.157866054" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.821089 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h6dfk" podStartSLOduration=90.821066542 podStartE2EDuration="1m30.821066542s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:25.820650991 +0000 UTC m=+111.186881265" watchObservedRunningTime="2025-09-29 09:46:25.821066542 +0000 UTC m=+111.187296816" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.831707 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podStartSLOduration=90.831689648 
podStartE2EDuration="1m30.831689648s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:25.831228986 +0000 UTC m=+111.197459250" watchObservedRunningTime="2025-09-29 09:46:25.831689648 +0000 UTC m=+111.197919912" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.845426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.845463 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.845473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.845487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.845497 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.948395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.948467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.948486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.948511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:25 crc kubenswrapper[4922]: I0929 09:46:25.948529 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:25Z","lastTransitionTime":"2025-09-29T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.051688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.051775 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.051800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.051889 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.051920 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.154298 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.154358 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.154377 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.154402 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.154450 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.257990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.258037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.258048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.258066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.258100 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.362616 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.362735 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.362755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.362783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.362804 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.465334 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.465383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.465398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.465416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.465429 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.568202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.568250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.568264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.568288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.568303 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.670980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.671049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.671129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.671155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.671172 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.774710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.774769 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.774786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.774810 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.774827 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.877119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.877181 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.877201 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.877226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.877243 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.981407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.981474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.981491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.981517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:26 crc kubenswrapper[4922]: I0929 09:46:26.981533 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:26Z","lastTransitionTime":"2025-09-29T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.084512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.084583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.084608 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.084638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.084660 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:27Z","lastTransitionTime":"2025-09-29T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.188012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.188097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.188127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.188161 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.188186 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:27Z","lastTransitionTime":"2025-09-29T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.291287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.291479 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.291502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.291558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.291576 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:27Z","lastTransitionTime":"2025-09-29T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.395029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.395085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.395103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.395126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.395144 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:27Z","lastTransitionTime":"2025-09-29T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.451326 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.451343 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.451442 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.451723 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:27 crc kubenswrapper[4922]: E0929 09:46:27.451958 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:27 crc kubenswrapper[4922]: E0929 09:46:27.452197 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:27 crc kubenswrapper[4922]: E0929 09:46:27.452287 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:27 crc kubenswrapper[4922]: E0929 09:46:27.452353 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.498937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.498988 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.499009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.499037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.499059 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:27Z","lastTransitionTime":"2025-09-29T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.602536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.602592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.602665 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.602695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.602718 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:27Z","lastTransitionTime":"2025-09-29T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.706918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.706977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.707002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.707028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.707049 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:27Z","lastTransitionTime":"2025-09-29T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.810934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.810992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.811018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.811050 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.811071 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:27Z","lastTransitionTime":"2025-09-29T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.915171 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.915231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.915249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.915274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:27 crc kubenswrapper[4922]: I0929 09:46:27.915290 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:27Z","lastTransitionTime":"2025-09-29T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.018391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.018440 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.018454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.018474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.018490 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.121367 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.121419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.121435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.121459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.121475 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.225026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.225085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.225101 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.225124 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.225141 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.328067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.328135 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.328162 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.328187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.328204 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.431564 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.431626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.431644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.431668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.431687 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.534871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.534972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.534996 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.535023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.535040 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.638211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.638288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.638311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.638341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.638363 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.741380 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.741467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.741503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.741535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.741557 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.844082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.844131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.844142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.844159 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.844170 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.947470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.947537 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.947554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.947579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:28 crc kubenswrapper[4922]: I0929 09:46:28.947601 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:28Z","lastTransitionTime":"2025-09-29T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.051120 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.051167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.051185 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.051207 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.051224 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.153823 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.153895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.153908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.153928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.153943 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.257485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.257545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.257558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.257581 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.257592 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.361478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.361582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.361605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.361630 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.361652 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.451591 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.451591 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.451721 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.451737 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:29 crc kubenswrapper[4922]: E0929 09:46:29.451855 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:29 crc kubenswrapper[4922]: E0929 09:46:29.452008 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:29 crc kubenswrapper[4922]: E0929 09:46:29.452120 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:29 crc kubenswrapper[4922]: E0929 09:46:29.452216 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.464378 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.464425 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.464443 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.464465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.464483 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.568099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.568189 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.568209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.568237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.568255 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.672327 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.672400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.672420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.672453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.672476 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.776548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.776626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.776648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.776679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.776703 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.880517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.880587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.880607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.880636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.880656 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.983407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.983470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.983487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.983515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:29 crc kubenswrapper[4922]: I0929 09:46:29.983533 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:29Z","lastTransitionTime":"2025-09-29T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.087539 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.087597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.087612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.087637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.087658 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:30Z","lastTransitionTime":"2025-09-29T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.191007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.191090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.191119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.191161 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.191183 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:30Z","lastTransitionTime":"2025-09-29T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.294107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.294186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.294199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.294225 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.294240 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:30Z","lastTransitionTime":"2025-09-29T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.397449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.397507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.397520 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.397542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.397556 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:30Z","lastTransitionTime":"2025-09-29T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.500494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.500559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.500580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.500608 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.500625 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:30Z","lastTransitionTime":"2025-09-29T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.603535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.603595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.603611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.603635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.603654 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:30Z","lastTransitionTime":"2025-09-29T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.706138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.706193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.706207 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.706229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.706243 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:30Z","lastTransitionTime":"2025-09-29T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.809755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.809812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.809844 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.809865 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.809878 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:30Z","lastTransitionTime":"2025-09-29T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.913850 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.913900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.913912 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.913948 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:30 crc kubenswrapper[4922]: I0929 09:46:30.913963 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:30Z","lastTransitionTime":"2025-09-29T09:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.017480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.017545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.017558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.017579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.017591 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.120536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.120584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.120597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.120612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.120624 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.143639 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/1.log" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.144550 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/0.log" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.144617 4922 generic.go:334] "Generic (PLEG): container finished" podID="7dc69012-4e4c-437b-82d8-9d04e2e22e58" containerID="0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273" exitCode=1 Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.144668 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6dfk" event={"ID":"7dc69012-4e4c-437b-82d8-9d04e2e22e58","Type":"ContainerDied","Data":"0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.144720 4922 scope.go:117] "RemoveContainer" containerID="571e4b396495b9c70edd6b7b648f9bddea0c80556c7f53bebcab08ad4e6403fc" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.145378 4922 scope.go:117] "RemoveContainer" containerID="0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273" Sep 29 09:46:31 crc kubenswrapper[4922]: E0929 09:46:31.145789 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-h6dfk_openshift-multus(7dc69012-4e4c-437b-82d8-9d04e2e22e58)\"" pod="openshift-multus/multus-h6dfk" podUID="7dc69012-4e4c-437b-82d8-9d04e2e22e58" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.176205 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podStartSLOduration=95.176113299 podStartE2EDuration="1m35.176113299s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:31.173089728 +0000 UTC m=+116.539320052" watchObservedRunningTime="2025-09-29 09:46:31.176113299 +0000 UTC m=+116.542343593" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.190311 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=24.19028176 podStartE2EDuration="24.19028176s" podCreationTimestamp="2025-09-29 09:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:31.18917686 +0000 UTC m=+116.555407124" watchObservedRunningTime="2025-09-29 09:46:31.19028176 +0000 UTC m=+116.556512064" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.224215 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.224270 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.224289 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.224311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.224329 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.327529 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.327572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.327580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.327597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.327606 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.431399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.431470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.431515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.431549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.431572 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.451742 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.451879 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.451978 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:31 crc kubenswrapper[4922]: E0929 09:46:31.451965 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:31 crc kubenswrapper[4922]: E0929 09:46:31.452192 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.452210 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:31 crc kubenswrapper[4922]: E0929 09:46:31.452303 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:31 crc kubenswrapper[4922]: E0929 09:46:31.452484 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.541131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.541192 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.541211 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.541236 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.541255 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.644424 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.644505 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.644525 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.644582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.644601 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.748481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.748540 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.748556 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.748586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.748608 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.852030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.852095 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.852118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.852146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.852168 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.955060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.955103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.955121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.955142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:31 crc kubenswrapper[4922]: I0929 09:46:31.955155 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:31Z","lastTransitionTime":"2025-09-29T09:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.057719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.057761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.057768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.057782 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.057790 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.151541 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/1.log" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.159640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.159734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.159755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.159780 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.159798 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.262459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.262529 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.262541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.262560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.262594 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.366260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.366337 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.366357 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.366383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.366400 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.470201 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.470277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.470295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.470321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.470339 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.573686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.573755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.573777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.573802 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.573820 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.676608 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.676678 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.676696 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.676719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.676736 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.780266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.780343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.780363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.780386 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.780404 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.883698 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.883767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.883781 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.883803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.883816 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.986927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.987006 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.987029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.987060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:32 crc kubenswrapper[4922]: I0929 09:46:32.987077 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:32Z","lastTransitionTime":"2025-09-29T09:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.090536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.090596 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.090619 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.090649 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.090672 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:33Z","lastTransitionTime":"2025-09-29T09:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.193918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.193970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.193983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.194004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.194017 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:33Z","lastTransitionTime":"2025-09-29T09:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.297328 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.297387 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.297404 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.297430 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.297447 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:33Z","lastTransitionTime":"2025-09-29T09:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.401900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.401942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.401956 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.401975 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.401991 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:33Z","lastTransitionTime":"2025-09-29T09:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.451206 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.451258 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.451272 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:33 crc kubenswrapper[4922]: E0929 09:46:33.451428 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.451707 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:33 crc kubenswrapper[4922]: E0929 09:46:33.452069 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:33 crc kubenswrapper[4922]: E0929 09:46:33.452183 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:33 crc kubenswrapper[4922]: E0929 09:46:33.452293 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.505186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.505245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.505259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.505279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.505294 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:33Z","lastTransitionTime":"2025-09-29T09:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.608492 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.608557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.608574 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.608598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.608616 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:33Z","lastTransitionTime":"2025-09-29T09:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.712316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.713312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.713476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.713627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.713778 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:33Z","lastTransitionTime":"2025-09-29T09:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.816570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.816631 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.816653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.816681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.816701 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:33Z","lastTransitionTime":"2025-09-29T09:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.918791 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.918824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.918871 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.918887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:33 crc kubenswrapper[4922]: I0929 09:46:33.918898 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:33Z","lastTransitionTime":"2025-09-29T09:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.021471 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.021581 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.021599 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.021628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.021647 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.124622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.124883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.124910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.124935 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.124952 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.227864 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.227938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.227961 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.227991 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.228014 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.331396 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.331470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.331489 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.331515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.331532 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.434609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.434674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.434691 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.434717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.434736 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.452421 4922 scope.go:117] "RemoveContainer" containerID="6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05" Sep 29 09:46:34 crc kubenswrapper[4922]: E0929 09:46:34.452643 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tr9bt_openshift-ovn-kubernetes(ee08d9f2-f100-4598-8ab3-5198a21b08f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.542610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.542676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.542689 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.542725 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.542741 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.645702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.645761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.645780 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.645803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.645820 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.749152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.750073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.750204 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.750360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.750532 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.853745 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.853806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.853868 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.853902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.853925 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.956462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.956778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.956959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.957108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.957245 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.968022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.968217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.968367 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.968512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 09:46:34 crc kubenswrapper[4922]: I0929 09:46:34.968653 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T09:46:34Z","lastTransitionTime":"2025-09-29T09:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.033281 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7"] Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.034799 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.037604 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.038225 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.039347 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.039730 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.111929 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.112050 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.112121 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.112151 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.112275 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.213756 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.213878 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.213949 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.214003 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.214034 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.214097 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.214128 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.215960 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.229892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.236440 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e87dc5f-c7be-449b-b98a-35932ecdd6bc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ql2z7\" (UID: \"8e87dc5f-c7be-449b-b98a-35932ecdd6bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.360197 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.450904 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.451018 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:35 crc kubenswrapper[4922]: E0929 09:46:35.452287 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.452327 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:35 crc kubenswrapper[4922]: I0929 09:46:35.452383 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:35 crc kubenswrapper[4922]: E0929 09:46:35.452549 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:35 crc kubenswrapper[4922]: E0929 09:46:35.452757 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:35 crc kubenswrapper[4922]: E0929 09:46:35.452955 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:35 crc kubenswrapper[4922]: E0929 09:46:35.472644 4922 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 29 09:46:35 crc kubenswrapper[4922]: E0929 09:46:35.547598 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 29 09:46:36 crc kubenswrapper[4922]: I0929 09:46:36.175562 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" event={"ID":"8e87dc5f-c7be-449b-b98a-35932ecdd6bc","Type":"ContainerStarted","Data":"746db5ce677e8014d9c7b583dff740c33a61c4c6dc7761511c5bcdabdb4907c4"} Sep 29 09:46:36 crc kubenswrapper[4922]: I0929 09:46:36.175650 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" event={"ID":"8e87dc5f-c7be-449b-b98a-35932ecdd6bc","Type":"ContainerStarted","Data":"71bd04e2db2016733101b3d78179cc3b961a6a43dc16f926091cdbc6b4b86e57"} Sep 29 09:46:36 crc kubenswrapper[4922]: I0929 09:46:36.192957 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ql2z7" podStartSLOduration=101.192936279 podStartE2EDuration="1m41.192936279s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:36.192720744 +0000 UTC m=+121.558951048" watchObservedRunningTime="2025-09-29 09:46:36.192936279 +0000 UTC m=+121.559166543" Sep 29 09:46:37 crc kubenswrapper[4922]: I0929 09:46:37.451355 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:37 crc kubenswrapper[4922]: I0929 09:46:37.451374 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:37 crc kubenswrapper[4922]: E0929 09:46:37.451608 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:37 crc kubenswrapper[4922]: E0929 09:46:37.451894 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:37 crc kubenswrapper[4922]: I0929 09:46:37.451956 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:37 crc kubenswrapper[4922]: I0929 09:46:37.451960 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:37 crc kubenswrapper[4922]: E0929 09:46:37.452154 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:37 crc kubenswrapper[4922]: E0929 09:46:37.452344 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:39 crc kubenswrapper[4922]: I0929 09:46:39.451768 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:39 crc kubenswrapper[4922]: I0929 09:46:39.451953 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:39 crc kubenswrapper[4922]: I0929 09:46:39.451998 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:39 crc kubenswrapper[4922]: I0929 09:46:39.452071 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:39 crc kubenswrapper[4922]: E0929 09:46:39.453290 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:39 crc kubenswrapper[4922]: E0929 09:46:39.453049 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:39 crc kubenswrapper[4922]: E0929 09:46:39.452857 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:39 crc kubenswrapper[4922]: E0929 09:46:39.453471 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:40 crc kubenswrapper[4922]: E0929 09:46:40.549203 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 09:46:41 crc kubenswrapper[4922]: I0929 09:46:41.451889 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:41 crc kubenswrapper[4922]: I0929 09:46:41.452018 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:41 crc kubenswrapper[4922]: E0929 09:46:41.452096 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:41 crc kubenswrapper[4922]: E0929 09:46:41.452280 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:41 crc kubenswrapper[4922]: I0929 09:46:41.452312 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:41 crc kubenswrapper[4922]: E0929 09:46:41.452502 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:41 crc kubenswrapper[4922]: I0929 09:46:41.453063 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:41 crc kubenswrapper[4922]: E0929 09:46:41.453210 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:43 crc kubenswrapper[4922]: I0929 09:46:43.450955 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:43 crc kubenswrapper[4922]: I0929 09:46:43.451090 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:43 crc kubenswrapper[4922]: I0929 09:46:43.451116 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:43 crc kubenswrapper[4922]: I0929 09:46:43.451140 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:43 crc kubenswrapper[4922]: E0929 09:46:43.451735 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:43 crc kubenswrapper[4922]: E0929 09:46:43.451867 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:43 crc kubenswrapper[4922]: E0929 09:46:43.451624 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:43 crc kubenswrapper[4922]: E0929 09:46:43.452068 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:45 crc kubenswrapper[4922]: I0929 09:46:45.450913 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:45 crc kubenswrapper[4922]: I0929 09:46:45.452822 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:45 crc kubenswrapper[4922]: I0929 09:46:45.452922 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:45 crc kubenswrapper[4922]: I0929 09:46:45.453019 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:45 crc kubenswrapper[4922]: E0929 09:46:45.453109 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:45 crc kubenswrapper[4922]: I0929 09:46:45.453227 4922 scope.go:117] "RemoveContainer" containerID="0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273" Sep 29 09:46:45 crc kubenswrapper[4922]: E0929 09:46:45.453277 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:45 crc kubenswrapper[4922]: E0929 09:46:45.453490 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:45 crc kubenswrapper[4922]: E0929 09:46:45.453598 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:45 crc kubenswrapper[4922]: E0929 09:46:45.557967 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 09:46:46 crc kubenswrapper[4922]: I0929 09:46:46.220410 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/1.log" Sep 29 09:46:46 crc kubenswrapper[4922]: I0929 09:46:46.220773 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6dfk" event={"ID":"7dc69012-4e4c-437b-82d8-9d04e2e22e58","Type":"ContainerStarted","Data":"86905bae7fc76fddcc8a482538e2e8667cfae303320d10f72f7a4053c8b9aefa"} Sep 29 09:46:47 crc kubenswrapper[4922]: I0929 09:46:47.450978 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:47 crc kubenswrapper[4922]: I0929 09:46:47.451036 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:47 crc kubenswrapper[4922]: I0929 09:46:47.451064 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:47 crc kubenswrapper[4922]: I0929 09:46:47.451099 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:47 crc kubenswrapper[4922]: E0929 09:46:47.451229 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:47 crc kubenswrapper[4922]: E0929 09:46:47.451325 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:47 crc kubenswrapper[4922]: E0929 09:46:47.451464 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:47 crc kubenswrapper[4922]: E0929 09:46:47.451587 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:49 crc kubenswrapper[4922]: I0929 09:46:49.451447 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:49 crc kubenswrapper[4922]: I0929 09:46:49.451588 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:49 crc kubenswrapper[4922]: I0929 09:46:49.451614 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:49 crc kubenswrapper[4922]: I0929 09:46:49.451462 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:49 crc kubenswrapper[4922]: E0929 09:46:49.451745 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:49 crc kubenswrapper[4922]: E0929 09:46:49.451869 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:49 crc kubenswrapper[4922]: E0929 09:46:49.452569 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:49 crc kubenswrapper[4922]: E0929 09:46:49.452626 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:49 crc kubenswrapper[4922]: I0929 09:46:49.453007 4922 scope.go:117] "RemoveContainer" containerID="6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05" Sep 29 09:46:50 crc kubenswrapper[4922]: I0929 09:46:50.237756 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/3.log" Sep 29 09:46:50 crc kubenswrapper[4922]: I0929 09:46:50.240940 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerStarted","Data":"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff"} Sep 29 09:46:50 crc kubenswrapper[4922]: I0929 09:46:50.241371 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:46:50 crc kubenswrapper[4922]: I0929 09:46:50.282681 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podStartSLOduration=115.282653804 podStartE2EDuration="1m55.282653804s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:46:50.281909444 +0000 UTC m=+135.648139738" watchObservedRunningTime="2025-09-29 09:46:50.282653804 +0000 UTC m=+135.648884058" Sep 29 09:46:50 crc kubenswrapper[4922]: I0929 09:46:50.345990 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9p9s8"] Sep 29 09:46:50 crc kubenswrapper[4922]: I0929 09:46:50.346112 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:50 crc kubenswrapper[4922]: E0929 09:46:50.346226 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:50 crc kubenswrapper[4922]: E0929 09:46:50.559003 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 09:46:51 crc kubenswrapper[4922]: I0929 09:46:51.451305 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:51 crc kubenswrapper[4922]: E0929 09:46:51.451502 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:51 crc kubenswrapper[4922]: I0929 09:46:51.451804 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:51 crc kubenswrapper[4922]: E0929 09:46:51.451965 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:51 crc kubenswrapper[4922]: I0929 09:46:51.452318 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:51 crc kubenswrapper[4922]: E0929 09:46:51.452564 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:52 crc kubenswrapper[4922]: I0929 09:46:52.450897 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:52 crc kubenswrapper[4922]: E0929 09:46:52.451149 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:53 crc kubenswrapper[4922]: I0929 09:46:53.451533 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:53 crc kubenswrapper[4922]: I0929 09:46:53.451630 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:53 crc kubenswrapper[4922]: I0929 09:46:53.451534 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:53 crc kubenswrapper[4922]: E0929 09:46:53.451798 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:53 crc kubenswrapper[4922]: E0929 09:46:53.451703 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:53 crc kubenswrapper[4922]: E0929 09:46:53.451986 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:54 crc kubenswrapper[4922]: I0929 09:46:54.450809 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:54 crc kubenswrapper[4922]: E0929 09:46:54.451001 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p9s8" podUID="48a99f27-a7b4-466d-b130-026774744f7d" Sep 29 09:46:55 crc kubenswrapper[4922]: I0929 09:46:55.450763 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:55 crc kubenswrapper[4922]: I0929 09:46:55.450928 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:55 crc kubenswrapper[4922]: I0929 09:46:55.450972 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:55 crc kubenswrapper[4922]: E0929 09:46:55.452198 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 09:46:55 crc kubenswrapper[4922]: E0929 09:46:55.452386 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 09:46:55 crc kubenswrapper[4922]: E0929 09:46:55.452509 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 09:46:56 crc kubenswrapper[4922]: I0929 09:46:56.451208 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:46:56 crc kubenswrapper[4922]: I0929 09:46:56.454541 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 29 09:46:56 crc kubenswrapper[4922]: I0929 09:46:56.454541 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 29 09:46:57 crc kubenswrapper[4922]: I0929 09:46:57.451458 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:46:57 crc kubenswrapper[4922]: I0929 09:46:57.451537 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:46:57 crc kubenswrapper[4922]: I0929 09:46:57.451659 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:46:57 crc kubenswrapper[4922]: I0929 09:46:57.454229 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 29 09:46:57 crc kubenswrapper[4922]: I0929 09:46:57.454327 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 29 09:46:57 crc kubenswrapper[4922]: I0929 09:46:57.454893 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 29 09:46:57 crc kubenswrapper[4922]: I0929 09:46:57.457823 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 29 09:46:59 crc kubenswrapper[4922]: I0929 09:46:59.070561 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:46:59 crc kubenswrapper[4922]: I0929 09:46:59.070658 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:46:59 crc kubenswrapper[4922]: I0929 09:46:59.540442 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.442461 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:03 crc kubenswrapper[4922]: E0929 09:47:03.442631 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:49:05.442594058 +0000 UTC m=+270.808824362 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.442954 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.443014 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.443052 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.443098 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.445042 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.451980 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.452458 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.453178 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 
09:47:03.480312 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.499139 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:47:03 crc kubenswrapper[4922]: I0929 09:47:03.510593 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 09:47:03 crc kubenswrapper[4922]: W0929 09:47:03.779092 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-666afe62b8a9cbb2904733da393427ba70c5bcc532e8c1872841043b7a4e9af1 WatchSource:0}: Error finding container 666afe62b8a9cbb2904733da393427ba70c5bcc532e8c1872841043b7a4e9af1: Status 404 returned error can't find the container with id 666afe62b8a9cbb2904733da393427ba70c5bcc532e8c1872841043b7a4e9af1 Sep 29 09:47:04 crc kubenswrapper[4922]: W0929 09:47:04.029943 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-5223135a22b33767878e6fab6e5debe311894bb17189af5938a979be1a3c138a WatchSource:0}: Error finding container 5223135a22b33767878e6fab6e5debe311894bb17189af5938a979be1a3c138a: Status 404 returned error can't find the container with id 5223135a22b33767878e6fab6e5debe311894bb17189af5938a979be1a3c138a Sep 29 09:47:04 crc kubenswrapper[4922]: W0929 09:47:04.032742 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-f08d073f0b9cd275e1cda62997f946f36e5385821c3b311bce099c79ca6f484e WatchSource:0}: Error finding 
container f08d073f0b9cd275e1cda62997f946f36e5385821c3b311bce099c79ca6f484e: Status 404 returned error can't find the container with id f08d073f0b9cd275e1cda62997f946f36e5385821c3b311bce099c79ca6f484e Sep 29 09:47:04 crc kubenswrapper[4922]: I0929 09:47:04.296378 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c99beed1ce29c23c7ad89e35b094eb7c5f1dd1e7aa5ca6fb8791b53cf94ff9d4"} Sep 29 09:47:04 crc kubenswrapper[4922]: I0929 09:47:04.296463 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5223135a22b33767878e6fab6e5debe311894bb17189af5938a979be1a3c138a"} Sep 29 09:47:04 crc kubenswrapper[4922]: I0929 09:47:04.296707 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:47:04 crc kubenswrapper[4922]: I0929 09:47:04.301046 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"33e67bacf6d2bab6d29584d0ab19e6dfe3455cb7032787061b066225e3316195"} Sep 29 09:47:04 crc kubenswrapper[4922]: I0929 09:47:04.301126 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"666afe62b8a9cbb2904733da393427ba70c5bcc532e8c1872841043b7a4e9af1"} Sep 29 09:47:04 crc kubenswrapper[4922]: I0929 09:47:04.304954 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1acd2c77b56befd977f75c3a68b73f4240eca0f31a68f3a72d7047a20e6de341"} Sep 29 09:47:04 crc kubenswrapper[4922]: I0929 09:47:04.305015 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f08d073f0b9cd275e1cda62997f946f36e5385821c3b311bce099c79ca6f484e"} Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.701024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.730027 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4pzjz"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.730427 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.736607 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.736825 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.737020 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.737417 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.737589 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 29 
09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.737715 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.738031 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.738593 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.741589 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.752949 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.753159 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.755022 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.755218 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.755448 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.755461 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.756164 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.755674 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.756335 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.756956 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.757282 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4zgtm"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.757754 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.758898 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.764514 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.765179 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.765286 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.765353 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.765535 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.765683 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.765895 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.766034 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.770319 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gg855"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.770412 
4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.770695 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.770757 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.770858 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.771098 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.771309 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-oauth-config\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.771352 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f1b4639a-6a43-4035-9402-cd1006e94e45-machine-approver-tls\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.771359 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 
09:47:05.771107 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.771114 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.771180 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.771947 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.771375 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f0b9f48-6af7-4c04-8edf-417fc84261a6-serving-cert\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772239 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3f0b9f48-6af7-4c04-8edf-417fc84261a6-encryption-config\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772270 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8vtv\" (UniqueName: \"kubernetes.io/projected/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-kube-api-access-c8vtv\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772292 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-serving-cert\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772334 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772355 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3f0b9f48-6af7-4c04-8edf-417fc84261a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772379 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f636765f-e16c-4597-88d7-327472ef1940-serving-cert\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772399 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-config\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772426 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f0b9f48-6af7-4c04-8edf-417fc84261a6-audit-dir\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772449 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-config\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772468 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-client-ca\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772489 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1b4639a-6a43-4035-9402-cd1006e94e45-config\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772514 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sd8m\" (UniqueName: \"kubernetes.io/projected/48e2c6f9-1502-4fa6-854d-ef25455dadb1-kube-api-access-4sd8m\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772577 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772593 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f0b9f48-6af7-4c04-8edf-417fc84261a6-etcd-client\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772619 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-trusted-ca-bundle\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772663 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772682 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772692 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-service-ca\") pod 
\"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772727 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-oauth-serving-cert\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772778 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772787 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772779 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npxtp\" (UniqueName: \"kubernetes.io/projected/5ab72b5a-8f7c-41aa-b179-68753a4c8100-kube-api-access-npxtp\") pod \"cluster-samples-operator-665b6dd947-ppwz4\" (UID: \"5ab72b5a-8f7c-41aa-b179-68753a4c8100\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.788925 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-config\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.788980 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4d5kr\" (UniqueName: \"kubernetes.io/projected/f1b4639a-6a43-4035-9402-cd1006e94e45-kube-api-access-4d5kr\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789005 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f0b9f48-6af7-4c04-8edf-417fc84261a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789028 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8mrf\" (UniqueName: \"kubernetes.io/projected/3f0b9f48-6af7-4c04-8edf-417fc84261a6-kube-api-access-m8mrf\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789049 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-serving-cert\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789077 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-client-ca\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789112 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbbvq\" (UniqueName: \"kubernetes.io/projected/f636765f-e16c-4597-88d7-327472ef1940-kube-api-access-fbbvq\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789138 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f1b4639a-6a43-4035-9402-cd1006e94e45-auth-proxy-config\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789159 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ab72b5a-8f7c-41aa-b179-68753a4c8100-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ppwz4\" (UID: \"5ab72b5a-8f7c-41aa-b179-68753a4c8100\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789196 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3f0b9f48-6af7-4c04-8edf-417fc84261a6-audit-policies\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772868 4922 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"oauth-serving-cert" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789292 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.780288 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ptqfl"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789634 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789976 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.790258 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dw64l"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.790535 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.790949 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dbc7l"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.791059 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.791245 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.791336 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-flszb"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.791586 4922 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.791614 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.791708 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-flszb"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772903 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.772970 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.812446 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.773052 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.773087 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.773124 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.812931 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.813329 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.773173 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.817751 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.818258 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.818438 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.773225 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.788418 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.788461 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.788482 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.788791 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.788925 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789013 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789047 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789095 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789128 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789162 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789192 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.789259 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.838061 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8ftc8"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.838528 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.838878 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jkk4k"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.839407 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.839921 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.839965 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8ftc8"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.840153 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.840261 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.840327 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.840372 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jkk4k"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.840894 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hnjbq"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.841798 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.841955 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.842294 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.842387 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.844209 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.844410 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.844431 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.844668 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.845056 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.847233 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.847440 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.848806 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.848982 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.849337 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.849425 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.849567 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.849842 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.850031 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.850686 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.861955 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6xjmx"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.862484 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.862634 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.863453 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.864917 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.865364 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.869788 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.870530 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.870769 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.871093 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.879201 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.880072 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.880201 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.880523 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.880629 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.880722 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.880821 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.880945 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.881060 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.881183 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.881383 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.881483 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.881590 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.881685 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.881784 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.883272 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.883613 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.883754 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.883939 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884027 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884048 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884178 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884191 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884317 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884421 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884539 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884628 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884734 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884792 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884815 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884906 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884948 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.884913 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.885028 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.886521 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.887187 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.887399 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wd7hc"]
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.888176 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.889907 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f636765f-e16c-4597-88d7-327472ef1940-serving-cert\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.889941 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-config\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.889974 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwsc9\" (UniqueName: \"kubernetes.io/projected/443a0f84-59dd-4acd-8299-f995b071562d-kube-api-access-xwsc9\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890004 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f0b9f48-6af7-4c04-8edf-417fc84261a6-audit-dir\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890038 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-serving-cert\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890063 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-service-ca-bundle\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890086 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-config\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890108 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-client-ca\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890126 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-apiservice-cert\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890142 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1b4639a-6a43-4035-9402-cd1006e94e45-config\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890158 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-etcd-service-ca\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890174 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-config\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890188 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7thwb\" (UniqueName: \"kubernetes.io/projected/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-kube-api-access-7thwb\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890207 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sd8m\" (UniqueName: \"kubernetes.io/projected/48e2c6f9-1502-4fa6-854d-ef25455dadb1-kube-api-access-4sd8m\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890225 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-proxy-tls\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890242 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f0b9f48-6af7-4c04-8edf-417fc84261a6-etcd-client\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890257 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-trusted-ca-bundle\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890275 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbx9d\" (UniqueName: \"kubernetes.io/projected/70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa-kube-api-access-lbx9d\") pod \"downloads-7954f5f757-8ftc8\" (UID: \"70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa\") " pod="openshift-console/downloads-7954f5f757-8ftc8"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890299 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-service-ca\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890318 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnftl\" (UniqueName: \"kubernetes.io/projected/4be461bc-d63f-4b94-951a-c8df10b91ab9-kube-api-access-mnftl\") pod \"dns-operator-744455d44c-ptqfl\" (UID: \"4be461bc-d63f-4b94-951a-c8df10b91ab9\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890338 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-oauth-serving-cert\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890356 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/443a0f84-59dd-4acd-8299-f995b071562d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890373 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jg2r\" (UniqueName: \"kubernetes.io/projected/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-kube-api-access-6jg2r\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890393 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npxtp\" (UniqueName: \"kubernetes.io/projected/5ab72b5a-8f7c-41aa-b179-68753a4c8100-kube-api-access-npxtp\") pod \"cluster-samples-operator-665b6dd947-ppwz4\" (UID: \"5ab72b5a-8f7c-41aa-b179-68753a4c8100\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890413 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-config\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890429 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9903f35e-2c84-46b8-a9de-c3920e709c83-config\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890437 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f0b9f48-6af7-4c04-8edf-417fc84261a6-audit-dir\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890445 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9903f35e-2c84-46b8-a9de-c3920e709c83-trusted-ca\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890506 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f0b9f48-6af7-4c04-8edf-417fc84261a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890585 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8mrf\" (UniqueName: \"kubernetes.io/projected/3f0b9f48-6af7-4c04-8edf-417fc84261a6-kube-api-access-m8mrf\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890622 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-serving-cert\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890651 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb4829f-c7a4-450d-9912-564ef25c4aaf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d6z6c\" (UID: \"8fb4829f-c7a4-450d-9912-564ef25c4aaf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890672 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-tmpfs\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d5kr\" (UniqueName: \"kubernetes.io/projected/f1b4639a-6a43-4035-9402-cd1006e94e45-kube-api-access-4d5kr\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890714 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-client-ca\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.890734 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrqk\" (UniqueName: \"kubernetes.io/projected/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-kube-api-access-thrqk\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891078 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-config\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891141 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f0b9f48-6af7-4c04-8edf-417fc84261a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891234 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8af2d9ab-f32b-48be-b6f3-3f1f35a6953b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7n2c8\" (UID: \"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891270 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbbvq\" (UniqueName: \"kubernetes.io/projected/f636765f-e16c-4597-88d7-327472ef1940-kube-api-access-fbbvq\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891352 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f1b4639a-6a43-4035-9402-cd1006e94e45-auth-proxy-config\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x"
Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891372 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ab72b5a-8f7c-41aa-b179-68753a4c8100-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ppwz4\" (UID: 
\"5ab72b5a-8f7c-41aa-b179-68753a4c8100\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891409 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-config\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891643 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1b4639a-6a43-4035-9402-cd1006e94e45-config\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891748 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-fnbn6"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891804 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-service-ca\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891845 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-etcd-ca\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891919 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-images\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891965 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8af2d9ab-f32b-48be-b6f3-3f1f35a6953b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7n2c8\" (UID: \"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.891996 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8cjw\" (UniqueName: \"kubernetes.io/projected/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-kube-api-access-g8cjw\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892034 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892066 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-serving-cert\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892095 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-oauth-serving-cert\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892118 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/443a0f84-59dd-4acd-8299-f995b071562d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892159 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3f0b9f48-6af7-4c04-8edf-417fc84261a6-audit-policies\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892201 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4be461bc-d63f-4b94-951a-c8df10b91ab9-metrics-tls\") pod \"dns-operator-744455d44c-ptqfl\" (UID: \"4be461bc-d63f-4b94-951a-c8df10b91ab9\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892225 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/443a0f84-59dd-4acd-8299-f995b071562d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892235 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jrk8f"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892258 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-oauth-config\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892281 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb4829f-c7a4-450d-9912-564ef25c4aaf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d6z6c\" (UID: \"8fb4829f-c7a4-450d-9912-564ef25c4aaf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892304 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gmd\" (UniqueName: \"kubernetes.io/projected/8fb4829f-c7a4-450d-9912-564ef25c4aaf-kube-api-access-x4gmd\") pod \"openshift-controller-manager-operator-756b6f6bc6-d6z6c\" (UID: \"8fb4829f-c7a4-450d-9912-564ef25c4aaf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892353 
4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892375 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9903f35e-2c84-46b8-a9de-c3920e709c83-serving-cert\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892576 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892787 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892819 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-client-ca\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892892 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f1b4639a-6a43-4035-9402-cd1006e94e45-machine-approver-tls\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892926 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f0b9f48-6af7-4c04-8edf-417fc84261a6-serving-cert\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.892941 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3f0b9f48-6af7-4c04-8edf-417fc84261a6-encryption-config\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893017 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8vtv\" (UniqueName: 
\"kubernetes.io/projected/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-kube-api-access-c8vtv\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893024 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-config\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893045 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-webhook-cert\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893106 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3f0b9f48-6af7-4c04-8edf-417fc84261a6-audit-policies\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893108 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-client-ca\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893108 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-serving-cert\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893169 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af2d9ab-f32b-48be-b6f3-3f1f35a6953b-config\") pod \"kube-controller-manager-operator-78b949d7b-7n2c8\" (UID: \"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893206 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893222 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-etcd-client\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893239 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqrs\" (UniqueName: \"kubernetes.io/projected/9903f35e-2c84-46b8-a9de-c3920e709c83-kube-api-access-ppqrs\") pod 
\"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893256 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3f0b9f48-6af7-4c04-8edf-417fc84261a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893609 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f1b4639a-6a43-4035-9402-cd1006e94e45-auth-proxy-config\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.893634 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-trusted-ca-bundle\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.894459 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3f0b9f48-6af7-4c04-8edf-417fc84261a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.894850 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-config\") pod 
\"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.895518 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.895812 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f636765f-e16c-4597-88d7-327472ef1940-serving-cert\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.895912 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-serving-cert\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.907810 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f1b4639a-6a43-4035-9402-cd1006e94e45-machine-approver-tls\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.907890 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3f0b9f48-6af7-4c04-8edf-417fc84261a6-serving-cert\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.908162 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3f0b9f48-6af7-4c04-8edf-417fc84261a6-encryption-config\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.908330 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-serving-cert\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.908399 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ab72b5a-8f7c-41aa-b179-68753a4c8100-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ppwz4\" (UID: \"5ab72b5a-8f7c-41aa-b179-68753a4c8100\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.910373 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f0b9f48-6af7-4c04-8edf-417fc84261a6-etcd-client\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.910433 4922 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.911970 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.913556 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.916143 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5mr24"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.916818 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.916944 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.917206 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.919620 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-oauth-config\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.921375 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.923927 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.928316 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.928603 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.929260 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.929954 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.930480 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.930600 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.935845 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4pzjz"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.949948 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.952570 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.955066 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.955336 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zk7zr"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.958246 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.958369 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zk7zr" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.959802 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dw64l"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.963341 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ptqfl"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.966662 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gg855"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.973452 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.975090 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4zgtm"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.976169 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.977102 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.980924 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dbc7l"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.983624 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-flszb"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.986277 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.987363 4922 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qvp28"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.988081 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qvp28" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.989747 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt"] Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.994522 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnftl\" (UniqueName: \"kubernetes.io/projected/4be461bc-d63f-4b94-951a-c8df10b91ab9-kube-api-access-mnftl\") pod \"dns-operator-744455d44c-ptqfl\" (UID: \"4be461bc-d63f-4b94-951a-c8df10b91ab9\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.994655 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/443a0f84-59dd-4acd-8299-f995b071562d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.994681 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jg2r\" (UniqueName: \"kubernetes.io/projected/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-kube-api-access-6jg2r\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.994736 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9903f35e-2c84-46b8-a9de-c3920e709c83-config\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.994756 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9903f35e-2c84-46b8-a9de-c3920e709c83-trusted-ca\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.994779 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb4829f-c7a4-450d-9912-564ef25c4aaf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d6z6c\" (UID: \"8fb4829f-c7a4-450d-9912-564ef25c4aaf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.996974 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-tmpfs\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997039 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrqk\" (UniqueName: \"kubernetes.io/projected/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-kube-api-access-thrqk\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997078 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8af2d9ab-f32b-48be-b6f3-3f1f35a6953b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7n2c8\" (UID: \"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997126 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-config\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997160 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-etcd-ca\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997184 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-images\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997206 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8cjw\" (UniqueName: \"kubernetes.io/projected/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-kube-api-access-g8cjw\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997226 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8af2d9ab-f32b-48be-b6f3-3f1f35a6953b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7n2c8\" (UID: \"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997244 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-serving-cert\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997262 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/443a0f84-59dd-4acd-8299-f995b071562d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997302 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/443a0f84-59dd-4acd-8299-f995b071562d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997323 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4be461bc-d63f-4b94-951a-c8df10b91ab9-metrics-tls\") pod \"dns-operator-744455d44c-ptqfl\" (UID: \"4be461bc-d63f-4b94-951a-c8df10b91ab9\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997341 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb4829f-c7a4-450d-9912-564ef25c4aaf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d6z6c\" (UID: \"8fb4829f-c7a4-450d-9912-564ef25c4aaf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997358 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gmd\" (UniqueName: \"kubernetes.io/projected/8fb4829f-c7a4-450d-9912-564ef25c4aaf-kube-api-access-x4gmd\") pod \"openshift-controller-manager-operator-756b6f6bc6-d6z6c\" (UID: \"8fb4829f-c7a4-450d-9912-564ef25c4aaf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997374 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997389 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9903f35e-2c84-46b8-a9de-c3920e709c83-serving-cert\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997419 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-webhook-cert\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997440 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af2d9ab-f32b-48be-b6f3-3f1f35a6953b-config\") pod \"kube-controller-manager-operator-78b949d7b-7n2c8\" (UID: \"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997457 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-etcd-client\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997509 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppqrs\" (UniqueName: 
\"kubernetes.io/projected/9903f35e-2c84-46b8-a9de-c3920e709c83-kube-api-access-ppqrs\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997526 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwsc9\" (UniqueName: \"kubernetes.io/projected/443a0f84-59dd-4acd-8299-f995b071562d-kube-api-access-xwsc9\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997547 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-serving-cert\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-service-ca-bundle\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997587 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-apiservice-cert\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 
09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997604 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7thwb\" (UniqueName: \"kubernetes.io/projected/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-kube-api-access-7thwb\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997620 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-etcd-service-ca\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997634 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-config\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997720 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-proxy-tls\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:05 crc kubenswrapper[4922]: I0929 09:47:05.997740 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbx9d\" (UniqueName: \"kubernetes.io/projected/70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa-kube-api-access-lbx9d\") pod 
\"downloads-7954f5f757-8ftc8\" (UID: \"70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa\") " pod="openshift-console/downloads-7954f5f757-8ftc8" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:05.998672 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:05.998687 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:05.998969 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9903f35e-2c84-46b8-a9de-c3920e709c83-config\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.000804 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.001154 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.001434 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-service-ca-bundle\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.001522 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb4829f-c7a4-450d-9912-564ef25c4aaf-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-d6z6c\" (UID: \"8fb4829f-c7a4-450d-9912-564ef25c4aaf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.002389 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-config\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.002639 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-etcd-service-ca\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.002809 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.003142 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-etcd-ca\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.003311 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/443a0f84-59dd-4acd-8299-f995b071562d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.003481 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-config\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.003695 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.003870 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-tmpfs\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.004852 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9903f35e-2c84-46b8-a9de-c3920e709c83-trusted-ca\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.007239 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-serving-cert\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.007307 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-etcd-client\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.007582 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6xjmx"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.008762 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4be461bc-d63f-4b94-951a-c8df10b91ab9-metrics-tls\") pod \"dns-operator-744455d44c-ptqfl\" (UID: \"4be461bc-d63f-4b94-951a-c8df10b91ab9\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.010027 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-serving-cert\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.010934 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.012808 4922 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.013004 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hnjbq"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.013099 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb4829f-c7a4-450d-9912-564ef25c4aaf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d6z6c\" (UID: \"8fb4829f-c7a4-450d-9912-564ef25c4aaf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.014495 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/443a0f84-59dd-4acd-8299-f995b071562d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.014672 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jrk8f"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.016175 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.017523 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.018886 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.020158 4922 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8ftc8"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.020818 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9903f35e-2c84-46b8-a9de-c3920e709c83-serving-cert\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.022191 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jkk4k"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.023696 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.024964 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6tlvg"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.026011 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.026107 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.027039 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dcbw5"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.028309 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.028618 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.030129 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.031702 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.041738 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zk7zr"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.044435 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qvp28"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.045264 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.046120 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5mr24"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.048795 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.050515 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.051901 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv"] Sep 29 09:47:06 
crc kubenswrapper[4922]: I0929 09:47:06.053369 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wd7hc"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.054707 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6tlvg"] Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.073017 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.094807 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.112606 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.131812 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.152374 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.173657 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.192787 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.197119 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-apiservice-cert\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: 
\"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.204950 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-webhook-cert\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.212338 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.233308 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.256257 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.273340 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.293947 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.312888 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.332351 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.337109 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8af2d9ab-f32b-48be-b6f3-3f1f35a6953b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7n2c8\" (UID: \"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.353918 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.373241 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.381939 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af2d9ab-f32b-48be-b6f3-3f1f35a6953b-config\") pod \"kube-controller-manager-operator-78b949d7b-7n2c8\" (UID: \"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.393711 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.407648 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-proxy-tls\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.413095 4922 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.433675 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.453623 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.462466 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-images\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.493049 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.512455 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.533032 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.554574 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.574351 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.593946 4922 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.612904 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.633911 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.654067 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.673234 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.692974 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.713303 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.733338 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.761086 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.774025 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 
09:47:06.793284 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.813351 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.833193 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.873957 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.881189 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.890901 4922 request.go:700] Waited for 1.002440441s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/secrets?fieldSelector=metadata.name%3Dv4-0-config-user-template-error&limit=500&resourceVersion=0 Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.892995 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.913149 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.932996 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.953153 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 
29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.973956 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 29 09:47:06 crc kubenswrapper[4922]: I0929 09:47:06.993192 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.026554 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.062608 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sd8m\" (UniqueName: \"kubernetes.io/projected/48e2c6f9-1502-4fa6-854d-ef25455dadb1-kube-api-access-4sd8m\") pod \"console-f9d7485db-4zgtm\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.081875 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d5kr\" (UniqueName: \"kubernetes.io/projected/f1b4639a-6a43-4035-9402-cd1006e94e45-kube-api-access-4d5kr\") pod \"machine-approver-56656f9798-pql8x\" (UID: \"f1b4639a-6a43-4035-9402-cd1006e94e45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.106312 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxtp\" (UniqueName: \"kubernetes.io/projected/5ab72b5a-8f7c-41aa-b179-68753a4c8100-kube-api-access-npxtp\") pod \"cluster-samples-operator-665b6dd947-ppwz4\" (UID: \"5ab72b5a-8f7c-41aa-b179-68753a4c8100\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.121669 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fbbvq\" (UniqueName: \"kubernetes.io/projected/f636765f-e16c-4597-88d7-327472ef1940-kube-api-access-fbbvq\") pod \"controller-manager-879f6c89f-4pzjz\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.133725 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.142992 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8mrf\" (UniqueName: \"kubernetes.io/projected/3f0b9f48-6af7-4c04-8edf-417fc84261a6-kube-api-access-m8mrf\") pod \"apiserver-7bbb656c7d-ccl4g\" (UID: \"3f0b9f48-6af7-4c04-8edf-417fc84261a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.153115 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.173708 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.193979 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.212910 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.253481 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.261566 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8vtv\" (UniqueName: 
\"kubernetes.io/projected/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-kube-api-access-c8vtv\") pod \"route-controller-manager-6576b87f9c-9kd4c\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.263081 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.272741 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.274045 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.294360 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.312369 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.313259 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.319442 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" event={"ID":"f1b4639a-6a43-4035-9402-cd1006e94e45","Type":"ContainerStarted","Data":"03a37e5e50605e49a2a2170e79f95e8983443d87c0d1bb1720a7942941f15797"} Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.330723 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.332751 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.350703 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.353146 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.357644 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.373334 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.393124 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.414308 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.433715 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.458461 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.478535 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 29 09:47:07 crc 
kubenswrapper[4922]: I0929 09:47:07.503597 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.514991 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.523394 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4pzjz"] Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.534807 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 09:47:07 crc kubenswrapper[4922]: W0929 09:47:07.537486 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf636765f_e16c_4597_88d7_327472ef1940.slice/crio-96477c43d183599280dfad23d3fc5408f7a5d8551fe76027339997fac9b4be81 WatchSource:0}: Error finding container 96477c43d183599280dfad23d3fc5408f7a5d8551fe76027339997fac9b4be81: Status 404 returned error can't find the container with id 96477c43d183599280dfad23d3fc5408f7a5d8551fe76027339997fac9b4be81 Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.552471 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.562960 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c"] Sep 29 09:47:07 crc kubenswrapper[4922]: W0929 09:47:07.572135 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda226dbbb_e39e_4fa1_aaab_1b28cffcfccd.slice/crio-356cd216a1e1397703f89c8bd9746decfbe9d74f5e3161335059ada49c3552d0 WatchSource:0}: Error finding container 356cd216a1e1397703f89c8bd9746decfbe9d74f5e3161335059ada49c3552d0: Status 404 returned error can't find the container with id 356cd216a1e1397703f89c8bd9746decfbe9d74f5e3161335059ada49c3552d0 Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.574043 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.601993 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.602905 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4"] Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.639519 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.639694 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.653258 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.663955 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4zgtm"] Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.672246 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.694532 4922 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.713053 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.732380 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.752300 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.773197 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.809018 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbx9d\" (UniqueName: \"kubernetes.io/projected/70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa-kube-api-access-lbx9d\") pod \"downloads-7954f5f757-8ftc8\" (UID: \"70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa\") " pod="openshift-console/downloads-7954f5f757-8ftc8" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.832907 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnftl\" (UniqueName: \"kubernetes.io/projected/4be461bc-d63f-4b94-951a-c8df10b91ab9-kube-api-access-mnftl\") pod \"dns-operator-744455d44c-ptqfl\" (UID: \"4be461bc-d63f-4b94-951a-c8df10b91ab9\") " pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.849869 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"] Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.854677 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/443a0f84-59dd-4acd-8299-f995b071562d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:07 crc kubenswrapper[4922]: W0929 09:47:07.865166 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f0b9f48_6af7_4c04_8edf_417fc84261a6.slice/crio-6698a576d499c6607cb6fa9e549dfb201aa5018dc9717b7fc23021d9f2f7dc37 WatchSource:0}: Error finding container 6698a576d499c6607cb6fa9e549dfb201aa5018dc9717b7fc23021d9f2f7dc37: Status 404 returned error can't find the container with id 6698a576d499c6607cb6fa9e549dfb201aa5018dc9717b7fc23021d9f2f7dc37 Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.872348 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jg2r\" (UniqueName: \"kubernetes.io/projected/340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1-kube-api-access-6jg2r\") pod \"machine-config-operator-74547568cd-kc92q\" (UID: \"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.888455 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwsc9\" (UniqueName: \"kubernetes.io/projected/443a0f84-59dd-4acd-8299-f995b071562d-kube-api-access-xwsc9\") pod \"cluster-image-registry-operator-dc59b4c8b-pwxh9\" (UID: \"443a0f84-59dd-4acd-8299-f995b071562d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.909924 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gmd\" (UniqueName: \"kubernetes.io/projected/8fb4829f-c7a4-450d-9912-564ef25c4aaf-kube-api-access-x4gmd\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-d6z6c\" (UID: \"8fb4829f-c7a4-450d-9912-564ef25c4aaf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.911395 4922 request.go:700] Waited for 1.910427043s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.928934 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7thwb\" (UniqueName: \"kubernetes.io/projected/3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3-kube-api-access-7thwb\") pod \"authentication-operator-69f744f599-dw64l\" (UID: \"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.934285 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.946898 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8af2d9ab-f32b-48be-b6f3-3f1f35a6953b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7n2c8\" (UID: \"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.966247 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrqk\" (UniqueName: \"kubernetes.io/projected/d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550-kube-api-access-thrqk\") pod \"etcd-operator-b45778765-flszb\" (UID: \"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550\") " pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.986936 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" Sep 29 09:47:07 crc kubenswrapper[4922]: I0929 09:47:07.993961 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8cjw\" (UniqueName: \"kubernetes.io/projected/1e5a5810-dbd5-4c66-92dd-51d669dc6eb1-kube-api-access-g8cjw\") pod \"packageserver-d55dfcdfc-pnjfh\" (UID: \"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.009209 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppqrs\" (UniqueName: \"kubernetes.io/projected/9903f35e-2c84-46b8-a9de-c3920e709c83-kube-api-access-ppqrs\") pod \"console-operator-58897d9998-jkk4k\" (UID: \"9903f35e-2c84-46b8-a9de-c3920e709c83\") " pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.013255 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.034496 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.036297 4922 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.037063 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.052573 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.057185 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.075878 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.089187 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.093203 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.104167 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8ftc8" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.116516 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.119534 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q"] Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.159615 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.201914 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.232115 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243092 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9fb5ca8c-3962-48f5-af8b-13cf7a012c8d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lkql5\" (UID: \"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243143 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grlz4\" (UniqueName: \"kubernetes.io/projected/55bb4804-b952-4c2e-be70-a1ce082f8b6d-kube-api-access-grlz4\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243166 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fdbfe98-a7d7-42eb-8d95-b1c6de178188-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-94448\" (UID: \"7fdbfe98-a7d7-42eb-8d95-b1c6de178188\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243182 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-audit\") pod \"apiserver-76f77b778f-gg855\" (UID: 
\"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243202 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243217 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d9227cd1-8691-4427-a216-1348be4c56fb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jzsqd\" (UID: \"d9227cd1-8691-4427-a216-1348be4c56fb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243234 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txf9t\" (UniqueName: \"kubernetes.io/projected/1e73604d-9d90-4f5e-bbcb-0ad4272e6553-kube-api-access-txf9t\") pod \"catalog-operator-68c6474976-r8nms\" (UID: \"1e73604d-9d90-4f5e-bbcb-0ad4272e6553\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243251 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67671cfa-2e1f-424a-bd8f-67e25492d817-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlv5h\" (UID: \"67671cfa-2e1f-424a-bd8f-67e25492d817\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243266 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/55bb4804-b952-4c2e-be70-a1ce082f8b6d-encryption-config\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243282 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907-proxy-tls\") pod \"machine-config-controller-84d6567774-kxkh6\" (UID: \"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243307 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67671cfa-2e1f-424a-bd8f-67e25492d817-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlv5h\" (UID: \"67671cfa-2e1f-424a-bd8f-67e25492d817\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243349 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1e73604d-9d90-4f5e-bbcb-0ad4272e6553-srv-cert\") pod \"catalog-operator-68c6474976-r8nms\" (UID: \"1e73604d-9d90-4f5e-bbcb-0ad4272e6553\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243365 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/dea0217e-c923-4045-9b4f-90a9eff30f93-machine-api-operator-tls\") 
pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243403 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctzx7\" (UniqueName: \"kubernetes.io/projected/d9227cd1-8691-4427-a216-1348be4c56fb-kube-api-access-ctzx7\") pod \"openshift-config-operator-7777fb866f-jzsqd\" (UID: \"d9227cd1-8691-4427-a216-1348be4c56fb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243427 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkmz6\" (UniqueName: \"kubernetes.io/projected/df443056-e7b1-48d4-92d5-0cb23872d4c3-kube-api-access-rkmz6\") pod \"multus-admission-controller-857f4d67dd-6xjmx\" (UID: \"df443056-e7b1-48d4-92d5-0cb23872d4c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243493 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dea0217e-c923-4045-9b4f-90a9eff30f93-images\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243509 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bb4804-b952-4c2e-be70-a1ce082f8b6d-serving-cert\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243538 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1e73604d-9d90-4f5e-bbcb-0ad4272e6553-profile-collector-cert\") pod \"catalog-operator-68c6474976-r8nms\" (UID: \"1e73604d-9d90-4f5e-bbcb-0ad4272e6553\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243554 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/df443056-e7b1-48d4-92d5-0cb23872d4c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6xjmx\" (UID: \"df443056-e7b1-48d4-92d5-0cb23872d4c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-trusted-ca\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243600 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9zqq\" (UniqueName: \"kubernetes.io/projected/3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907-kube-api-access-r9zqq\") pod \"machine-config-controller-84d6567774-kxkh6\" (UID: \"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkndf\" (UniqueName: 
\"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-kube-api-access-mkndf\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243637 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9fb5ca8c-3962-48f5-af8b-13cf7a012c8d-srv-cert\") pod \"olm-operator-6b444d44fb-lkql5\" (UID: \"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243652 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243667 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-config\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.243687 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9227cd1-8691-4427-a216-1348be4c56fb-serving-cert\") pod \"openshift-config-operator-7777fb866f-jzsqd\" (UID: \"d9227cd1-8691-4427-a216-1348be4c56fb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 
09:47:08.243720 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c"] Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.245477 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-certificates\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.245648 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6mc6\" (UniqueName: \"kubernetes.io/projected/b2ea2f47-4732-47bd-9099-c503b5610f43-kube-api-access-b6mc6\") pod \"control-plane-machine-set-operator-78cbb6b69f-cftkt\" (UID: \"b2ea2f47-4732-47bd-9099-c503b5610f43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.245691 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdbfe98-a7d7-42eb-8d95-b1c6de178188-config\") pod \"openshift-apiserver-operator-796bbdcf4f-94448\" (UID: \"7fdbfe98-a7d7-42eb-8d95-b1c6de178188\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.245719 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsktz\" (UniqueName: \"kubernetes.io/projected/dea0217e-c923-4045-9b4f-90a9eff30f93-kube-api-access-dsktz\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 
09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.245744 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/55bb4804-b952-4c2e-be70-a1ce082f8b6d-etcd-client\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.245764 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-image-import-ca\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.245784 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7hvj\" (UniqueName: \"kubernetes.io/projected/7fdbfe98-a7d7-42eb-8d95-b1c6de178188-kube-api-access-b7hvj\") pod \"openshift-apiserver-operator-796bbdcf4f-94448\" (UID: \"7fdbfe98-a7d7-42eb-8d95-b1c6de178188\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.245888 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2ea2f47-4732-47bd-9099-c503b5610f43-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cftkt\" (UID: \"b2ea2f47-4732-47bd-9099-c503b5610f43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.245940 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-metrics-tls\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.246139 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.246172 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-trusted-ca\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.246395 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.246449 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/55bb4804-b952-4c2e-be70-a1ce082f8b6d-node-pullsecrets\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc 
kubenswrapper[4922]: I0929 09:47:08.246550 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jmlm\" (UniqueName: \"kubernetes.io/projected/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-kube-api-access-9jmlm\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.246627 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-etcd-serving-ca\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.246670 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kxkh6\" (UID: \"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.246697 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn7xl\" (UniqueName: \"kubernetes.io/projected/9fb5ca8c-3962-48f5-af8b-13cf7a012c8d-kube-api-access-dn7xl\") pod \"olm-operator-6b444d44fb-lkql5\" (UID: \"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" Sep 29 09:47:08 crc kubenswrapper[4922]: E0929 09:47:08.246821 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-09-29 09:47:08.746806306 +0000 UTC m=+154.113036570 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.247233 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67671cfa-2e1f-424a-bd8f-67e25492d817-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlv5h\" (UID: \"67671cfa-2e1f-424a-bd8f-67e25492d817\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.247270 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.247334 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-tls\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.247390 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-bound-sa-token\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.247420 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea0217e-c923-4045-9b4f-90a9eff30f93-config\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.247449 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/55bb4804-b952-4c2e-be70-a1ce082f8b6d-audit-dir\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.339481 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" event={"ID":"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd","Type":"ContainerStarted","Data":"7047a1f497040128e27786096f8580c1c34c6bec9073fc72418389ed678a0080"} Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.339537 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" event={"ID":"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd","Type":"ContainerStarted","Data":"356cd216a1e1397703f89c8bd9746decfbe9d74f5e3161335059ada49c3552d0"} Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.339900 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348168 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348390 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grlz4\" (UniqueName: \"kubernetes.io/projected/55bb4804-b952-4c2e-be70-a1ce082f8b6d-kube-api-access-grlz4\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348415 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348447 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-audit\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348493 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/d9227cd1-8691-4427-a216-1348be4c56fb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jzsqd\" (UID: \"d9227cd1-8691-4427-a216-1348be4c56fb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348513 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1-config\") pod \"kube-apiserver-operator-766d6c64bb-xnmw7\" (UID: \"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348534 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl95g\" (UniqueName: \"kubernetes.io/projected/a24b1532-d6be-4a8e-a843-742f6328c431-kube-api-access-hl95g\") pod \"collect-profiles-29318985-bq6tn\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348549 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjmnv\" (UniqueName: \"kubernetes.io/projected/a2f39a22-ce04-4af3-a76f-adbba71624b6-kube-api-access-xjmnv\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348577 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txf9t\" (UniqueName: \"kubernetes.io/projected/1e73604d-9d90-4f5e-bbcb-0ad4272e6553-kube-api-access-txf9t\") pod \"catalog-operator-68c6474976-r8nms\" (UID: \"1e73604d-9d90-4f5e-bbcb-0ad4272e6553\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348593 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67671cfa-2e1f-424a-bd8f-67e25492d817-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlv5h\" (UID: \"67671cfa-2e1f-424a-bd8f-67e25492d817\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348619 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4370b7e1-a94b-463b-b286-66912465b7fe-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2nmkn\" (UID: \"4370b7e1-a94b-463b-b286-66912465b7fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348654 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5mr24\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348669 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xnmw7\" (UID: \"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348683 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348702 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348719 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctzx7\" (UniqueName: \"kubernetes.io/projected/d9227cd1-8691-4427-a216-1348be4c56fb-kube-api-access-ctzx7\") pod \"openshift-config-operator-7777fb866f-jzsqd\" (UID: \"d9227cd1-8691-4427-a216-1348be4c56fb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348735 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkmz6\" (UniqueName: \"kubernetes.io/projected/df443056-e7b1-48d4-92d5-0cb23872d4c3-kube-api-access-rkmz6\") pod \"multus-admission-controller-857f4d67dd-6xjmx\" (UID: \"df443056-e7b1-48d4-92d5-0cb23872d4c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348756 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfd4f\" (UniqueName: 
\"kubernetes.io/projected/7c02ebae-a241-4927-914e-ae531fc71bc0-kube-api-access-zfd4f\") pod \"machine-config-server-dcbw5\" (UID: \"7c02ebae-a241-4927-914e-ae531fc71bc0\") " pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348773 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dea0217e-c923-4045-9b4f-90a9eff30f93-images\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348793 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctzdc\" (UniqueName: \"kubernetes.io/projected/081e1e41-0f63-463a-b699-4c680f61122b-kube-api-access-ctzdc\") pod \"marketplace-operator-79b997595-5mr24\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348820 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-trusted-ca\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348869 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkndf\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-kube-api-access-mkndf\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348896 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348913 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-certificates\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348929 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6mc6\" (UniqueName: \"kubernetes.io/projected/b2ea2f47-4732-47bd-9099-c503b5610f43-kube-api-access-b6mc6\") pod \"control-plane-machine-set-operator-78cbb6b69f-cftkt\" (UID: \"b2ea2f47-4732-47bd-9099-c503b5610f43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348945 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdbfe98-a7d7-42eb-8d95-b1c6de178188-config\") pod \"openshift-apiserver-operator-796bbdcf4f-94448\" (UID: \"7fdbfe98-a7d7-42eb-8d95-b1c6de178188\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348961 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a24b1532-d6be-4a8e-a843-742f6328c431-config-volume\") pod \"collect-profiles-29318985-bq6tn\" (UID: 
\"a24b1532-d6be-4a8e-a843-742f6328c431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348982 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-audit-policies\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.348998 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349268 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/272adf48-5f20-40e4-9bf0-563c630f9e12-default-certificate\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: E0929 09:47:08.349333 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:08.84929272 +0000 UTC m=+154.215522984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349378 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsktz\" (UniqueName: \"kubernetes.io/projected/dea0217e-c923-4045-9b4f-90a9eff30f93-kube-api-access-dsktz\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349413 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-csi-data-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349438 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/272adf48-5f20-40e4-9bf0-563c630f9e12-stats-auth\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349463 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6tg\" (UniqueName: \"kubernetes.io/projected/272adf48-5f20-40e4-9bf0-563c630f9e12-kube-api-access-8r6tg\") pod 
\"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349486 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4370b7e1-a94b-463b-b286-66912465b7fe-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2nmkn\" (UID: \"4370b7e1-a94b-463b-b286-66912465b7fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349534 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2ea2f47-4732-47bd-9099-c503b5610f43-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cftkt\" (UID: \"b2ea2f47-4732-47bd-9099-c503b5610f43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349566 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7hvj\" (UniqueName: \"kubernetes.io/projected/7fdbfe98-a7d7-42eb-8d95-b1c6de178188-kube-api-access-b7hvj\") pod \"openshift-apiserver-operator-796bbdcf4f-94448\" (UID: \"7fdbfe98-a7d7-42eb-8d95-b1c6de178188\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349596 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349676 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-etcd-serving-ca\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349698 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kxkh6\" (UID: \"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349738 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-mountpoint-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349765 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/272adf48-5f20-40e4-9bf0-563c630f9e12-service-ca-bundle\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349786 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3de1e108-32f2-42b5-b5e8-1e0337a8f973-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l9gl6\" (UID: \"3de1e108-32f2-42b5-b5e8-1e0337a8f973\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349814 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67671cfa-2e1f-424a-bd8f-67e25492d817-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlv5h\" (UID: \"67671cfa-2e1f-424a-bd8f-67e25492d817\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349854 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349886 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/55bb4804-b952-4c2e-be70-a1ce082f8b6d-audit-dir\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349911 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/90a2c218-5fb1-4bcb-873d-a5667f73bc25-signing-key\") pod \"service-ca-9c57cc56f-jrk8f\" (UID: \"90a2c218-5fb1-4bcb-873d-a5667f73bc25\") " pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349940 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhg7q\" (UniqueName: \"kubernetes.io/projected/aee42a43-42ec-4f54-b8c0-8721d2816541-kube-api-access-lhg7q\") pod \"dns-default-zk7zr\" (UID: \"aee42a43-42ec-4f54-b8c0-8721d2816541\") " pod="openshift-dns/dns-default-zk7zr" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349963 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-tls\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.349987 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-bound-sa-token\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350026 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-registration-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350051 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvx8v\" (UniqueName: \"kubernetes.io/projected/4bfe4903-5df0-40a6-9216-56d1a6ae2f9b-kube-api-access-fvx8v\") pod \"service-ca-operator-777779d784-vnsrv\" (UID: \"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350073 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7rgz\" (UniqueName: \"kubernetes.io/projected/8395ee11-ec83-495d-843f-f41cdb86d8bb-kube-api-access-p7rgz\") pod \"migrator-59844c95c7-qz8cj\" (UID: \"8395ee11-ec83-495d-843f-f41cdb86d8bb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350109 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bfe4903-5df0-40a6-9216-56d1a6ae2f9b-serving-cert\") pod \"service-ca-operator-777779d784-vnsrv\" (UID: \"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350132 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k75pc\" (UniqueName: \"kubernetes.io/projected/3174b863-8467-4dec-b1fd-602610f72a9f-kube-api-access-k75pc\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350156 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fdbfe98-a7d7-42eb-8d95-b1c6de178188-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-94448\" (UID: \"7fdbfe98-a7d7-42eb-8d95-b1c6de178188\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350180 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350203 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350236 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/90a2c218-5fb1-4bcb-873d-a5667f73bc25-signing-cabundle\") pod \"service-ca-9c57cc56f-jrk8f\" (UID: \"90a2c218-5fb1-4bcb-873d-a5667f73bc25\") " pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350277 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350299 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a24b1532-d6be-4a8e-a843-742f6328c431-secret-volume\") pod \"collect-profiles-29318985-bq6tn\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350337 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907-proxy-tls\") pod \"machine-config-controller-84d6567774-kxkh6\" (UID: \"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350364 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m45fw\" (UniqueName: \"kubernetes.io/projected/90a2c218-5fb1-4bcb-873d-a5667f73bc25-kube-api-access-m45fw\") pod \"service-ca-9c57cc56f-jrk8f\" (UID: \"90a2c218-5fb1-4bcb-873d-a5667f73bc25\") " pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350402 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/55bb4804-b952-4c2e-be70-a1ce082f8b6d-encryption-config\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350426 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67671cfa-2e1f-424a-bd8f-67e25492d817-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlv5h\" (UID: \"67671cfa-2e1f-424a-bd8f-67e25492d817\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350463 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/3174b863-8467-4dec-b1fd-602610f72a9f-audit-dir\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350488 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7c02ebae-a241-4927-914e-ae531fc71bc0-node-bootstrap-token\") pod \"machine-config-server-dcbw5\" (UID: \"7c02ebae-a241-4927-914e-ae531fc71bc0\") " pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350555 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/dea0217e-c923-4045-9b4f-90a9eff30f93-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350576 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-audit\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350584 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350615 
4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1e73604d-9d90-4f5e-bbcb-0ad4272e6553-srv-cert\") pod \"catalog-operator-68c6474976-r8nms\" (UID: \"1e73604d-9d90-4f5e-bbcb-0ad4272e6553\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350643 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5mr24\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350665 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bfe4903-5df0-40a6-9216-56d1a6ae2f9b-config\") pod \"service-ca-operator-777779d784-vnsrv\" (UID: \"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350688 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/272adf48-5f20-40e4-9bf0-563c630f9e12-metrics-certs\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350748 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7c02ebae-a241-4927-914e-ae531fc71bc0-certs\") pod \"machine-config-server-dcbw5\" (UID: \"7c02ebae-a241-4927-914e-ae531fc71bc0\") " 
pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350809 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-plugins-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350950 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bb4804-b952-4c2e-be70-a1ce082f8b6d-serving-cert\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350961 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d9227cd1-8691-4427-a216-1348be4c56fb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jzsqd\" (UID: \"d9227cd1-8691-4427-a216-1348be4c56fb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.350979 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-socket-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.351006 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1e73604d-9d90-4f5e-bbcb-0ad4272e6553-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-r8nms\" (UID: \"1e73604d-9d90-4f5e-bbcb-0ad4272e6553\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.351033 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/df443056-e7b1-48d4-92d5-0cb23872d4c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6xjmx\" (UID: \"df443056-e7b1-48d4-92d5-0cb23872d4c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.351061 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9zqq\" (UniqueName: \"kubernetes.io/projected/3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907-kube-api-access-r9zqq\") pod \"machine-config-controller-84d6567774-kxkh6\" (UID: \"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.351090 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rxph\" (UniqueName: \"kubernetes.io/projected/be0ebf80-6feb-4ac2-891e-c46c91dc7664-kube-api-access-7rxph\") pod \"ingress-canary-qvp28\" (UID: \"be0ebf80-6feb-4ac2-891e-c46c91dc7664\") " pod="openshift-ingress-canary/ingress-canary-qvp28" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.351126 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-config\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.351690 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9227cd1-8691-4427-a216-1348be4c56fb-serving-cert\") pod \"openshift-config-operator-7777fb866f-jzsqd\" (UID: \"d9227cd1-8691-4427-a216-1348be4c56fb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.351712 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9fb5ca8c-3962-48f5-af8b-13cf7a012c8d-srv-cert\") pod \"olm-operator-6b444d44fb-lkql5\" (UID: \"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.351739 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee42a43-42ec-4f54-b8c0-8721d2816541-config-volume\") pod \"dns-default-zk7zr\" (UID: \"aee42a43-42ec-4f54-b8c0-8721d2816541\") " pod="openshift-dns/dns-default-zk7zr" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.351760 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5rc2\" (UniqueName: \"kubernetes.io/projected/3de1e108-32f2-42b5-b5e8-1e0337a8f973-kube-api-access-c5rc2\") pod \"package-server-manager-789f6589d5-l9gl6\" (UID: \"3de1e108-32f2-42b5-b5e8-1e0337a8f973\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.352060 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kxkh6\" (UID: \"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" Sep 29 
09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.352562 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-trusted-ca\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.370150 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-certificates\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.370246 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67671cfa-2e1f-424a-bd8f-67e25492d817-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlv5h\" (UID: \"67671cfa-2e1f-424a-bd8f-67e25492d817\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.373623 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdbfe98-a7d7-42eb-8d95-b1c6de178188-config\") pod \"openshift-apiserver-operator-796bbdcf4f-94448\" (UID: \"7fdbfe98-a7d7-42eb-8d95-b1c6de178188\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.373693 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/55bb4804-b952-4c2e-be70-a1ce082f8b6d-audit-dir\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.374672 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dea0217e-c923-4045-9b4f-90a9eff30f93-images\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.375267 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.376592 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2ea2f47-4732-47bd-9099-c503b5610f43-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cftkt\" (UID: \"b2ea2f47-4732-47bd-9099-c503b5610f43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.381603 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67671cfa-2e1f-424a-bd8f-67e25492d817-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlv5h\" (UID: \"67671cfa-2e1f-424a-bd8f-67e25492d817\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.381991 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-config\") 
pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.382036 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-etcd-serving-ca\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387270 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xnmw7\" (UID: \"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387331 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be0ebf80-6feb-4ac2-891e-c46c91dc7664-cert\") pod \"ingress-canary-qvp28\" (UID: \"be0ebf80-6feb-4ac2-891e-c46c91dc7664\") " pod="openshift-ingress-canary/ingress-canary-qvp28" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387397 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/55bb4804-b952-4c2e-be70-a1ce082f8b6d-etcd-client\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387434 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-image-import-ca\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387469 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-metrics-tls\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387531 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387562 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-trusted-ca\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387596 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387690 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387733 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/55bb4804-b952-4c2e-be70-a1ce082f8b6d-node-pullsecrets\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387769 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn7xl\" (UniqueName: \"kubernetes.io/projected/9fb5ca8c-3962-48f5-af8b-13cf7a012c8d-kube-api-access-dn7xl\") pod \"olm-operator-6b444d44fb-lkql5\" (UID: \"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.387805 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aee42a43-42ec-4f54-b8c0-8721d2816541-metrics-tls\") pod \"dns-default-zk7zr\" (UID: \"aee42a43-42ec-4f54-b8c0-8721d2816541\") " pod="openshift-dns/dns-default-zk7zr"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.391047 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.394981 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-tls\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.395247 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-trusted-ca\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.395957 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/55bb4804-b952-4c2e-be70-a1ce082f8b6d-node-pullsecrets\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855"
Sep 29 09:47:08 crc kubenswrapper[4922]: E0929 09:47:08.396365 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:08.896338385 +0000 UTC m=+154.262568649 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.396651 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9227cd1-8691-4427-a216-1348be4c56fb-serving-cert\") pod \"openshift-config-operator-7777fb866f-jzsqd\" (UID: \"d9227cd1-8691-4427-a216-1348be4c56fb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.396923 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jmlm\" (UniqueName: \"kubernetes.io/projected/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-kube-api-access-9jmlm\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.396956 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.397234 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/55bb4804-b952-4c2e-be70-a1ce082f8b6d-encryption-config\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.397747 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.397817 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea0217e-c923-4045-9b4f-90a9eff30f93-config\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.397872 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tjw8\" (UniqueName: \"kubernetes.io/projected/4370b7e1-a94b-463b-b286-66912465b7fe-kube-api-access-6tjw8\") pod \"kube-storage-version-migrator-operator-b67b599dd-2nmkn\" (UID: \"4370b7e1-a94b-463b-b286-66912465b7fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.398661 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9fb5ca8c-3962-48f5-af8b-13cf7a012c8d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lkql5\" (UID: \"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.400594 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" event={"ID":"8fb4829f-c7a4-450d-9912-564ef25c4aaf","Type":"ContainerStarted","Data":"c58b444a3951a2202cc171c3d3053465430678820bc626f20fffcf7455bbf929"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.400747 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea0217e-c923-4045-9b4f-90a9eff30f93-config\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.402362 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/55bb4804-b952-4c2e-be70-a1ce082f8b6d-image-import-ca\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.404798 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fdbfe98-a7d7-42eb-8d95-b1c6de178188-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-94448\" (UID: \"7fdbfe98-a7d7-42eb-8d95-b1c6de178188\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.415352 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grlz4\" (UniqueName: \"kubernetes.io/projected/55bb4804-b952-4c2e-be70-a1ce082f8b6d-kube-api-access-grlz4\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.421636 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/df443056-e7b1-48d4-92d5-0cb23872d4c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6xjmx\" (UID: \"df443056-e7b1-48d4-92d5-0cb23872d4c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.422939 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dw64l"]
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.424121 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" event={"ID":"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1","Type":"ContainerStarted","Data":"c7af7f800326a5d5dbd2f412bd8a879565c77717699dcc09eafb9b653a2c9e5e"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.426402 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctzx7\" (UniqueName: \"kubernetes.io/projected/d9227cd1-8691-4427-a216-1348be4c56fb-kube-api-access-ctzx7\") pod \"openshift-config-operator-7777fb866f-jzsqd\" (UID: \"d9227cd1-8691-4427-a216-1348be4c56fb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.430046 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" event={"ID":"5ab72b5a-8f7c-41aa-b179-68753a4c8100","Type":"ContainerStarted","Data":"a2cefc2b684cca7b0647473faa25582cb6c71861f670ea74f1cdc109e6e80102"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.430094 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" event={"ID":"5ab72b5a-8f7c-41aa-b179-68753a4c8100","Type":"ContainerStarted","Data":"cd02f345498c8becd7f8efb2bba0e957a54f8b60311f4bcc51963cd45843277c"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.430106 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" event={"ID":"5ab72b5a-8f7c-41aa-b179-68753a4c8100","Type":"ContainerStarted","Data":"d93be83c0ef3ad1b30311ee226498c1f860cd0946e8e6a1db7132c7cf1b1cb60"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.438605 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.438221 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1e73604d-9d90-4f5e-bbcb-0ad4272e6553-profile-collector-cert\") pod \"catalog-operator-68c6474976-r8nms\" (UID: \"1e73604d-9d90-4f5e-bbcb-0ad4272e6553\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.453033 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/dea0217e-c923-4045-9b4f-90a9eff30f93-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.453347 4922 generic.go:334] "Generic (PLEG): container finished" podID="3f0b9f48-6af7-4c04-8edf-417fc84261a6" containerID="c08dde9d10c3144d65ba299beea2ccfa8df4a23c9bbb83967f36ee174f1f028f" exitCode=0
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.453433 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" event={"ID":"3f0b9f48-6af7-4c04-8edf-417fc84261a6","Type":"ContainerDied","Data":"c08dde9d10c3144d65ba299beea2ccfa8df4a23c9bbb83967f36ee174f1f028f"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.453465 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" event={"ID":"3f0b9f48-6af7-4c04-8edf-417fc84261a6","Type":"ContainerStarted","Data":"6698a576d499c6607cb6fa9e549dfb201aa5018dc9717b7fc23021d9f2f7dc37"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.454439 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1e73604d-9d90-4f5e-bbcb-0ad4272e6553-srv-cert\") pod \"catalog-operator-68c6474976-r8nms\" (UID: \"1e73604d-9d90-4f5e-bbcb-0ad4272e6553\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.455541 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9fb5ca8c-3962-48f5-af8b-13cf7a012c8d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lkql5\" (UID: \"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.456097 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8ftc8"]
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.457709 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bb4804-b952-4c2e-be70-a1ce082f8b6d-serving-cert\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.461014 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-metrics-tls\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w"
Sep 29 09:47:08 crc kubenswrapper[4922]: W0929 09:47:08.461052 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70b9ec8e_e3e5_45b5_abf4_3d89fa8530aa.slice/crio-8b34845662b4e3a296e6c43c4e9b82275ca269e34a4989e0aecb432d7c11db65 WatchSource:0}: Error finding container 8b34845662b4e3a296e6c43c4e9b82275ca269e34a4989e0aecb432d7c11db65: Status 404 returned error can't find the container with id 8b34845662b4e3a296e6c43c4e9b82275ca269e34a4989e0aecb432d7c11db65
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.461540 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/55bb4804-b952-4c2e-be70-a1ce082f8b6d-etcd-client\") pod \"apiserver-76f77b778f-gg855\" (UID: \"55bb4804-b952-4c2e-be70-a1ce082f8b6d\") " pod="openshift-apiserver/apiserver-76f77b778f-gg855"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.464492 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907-proxy-tls\") pod \"machine-config-controller-84d6567774-kxkh6\" (UID: \"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.465275 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9fb5ca8c-3962-48f5-af8b-13cf7a012c8d-srv-cert\") pod \"olm-operator-6b444d44fb-lkql5\" (UID: \"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.467262 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" event={"ID":"f636765f-e16c-4597-88d7-327472ef1940","Type":"ContainerStarted","Data":"58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.467347 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jkk4k"]
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.467369 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" event={"ID":"f636765f-e16c-4597-88d7-327472ef1940","Type":"ContainerStarted","Data":"96477c43d183599280dfad23d3fc5408f7a5d8551fe76027339997fac9b4be81"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.467752 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.468333 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txf9t\" (UniqueName: \"kubernetes.io/projected/1e73604d-9d90-4f5e-bbcb-0ad4272e6553-kube-api-access-txf9t\") pod \"catalog-operator-68c6474976-r8nms\" (UID: \"1e73604d-9d90-4f5e-bbcb-0ad4272e6553\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.471517 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkmz6\" (UniqueName: \"kubernetes.io/projected/df443056-e7b1-48d4-92d5-0cb23872d4c3-kube-api-access-rkmz6\") pod \"multus-admission-controller-857f4d67dd-6xjmx\" (UID: \"df443056-e7b1-48d4-92d5-0cb23872d4c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.472295 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkndf\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-kube-api-access-mkndf\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.478333 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh"]
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.486651 4922 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4pzjz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.486715 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" podUID="f636765f-e16c-4597-88d7-327472ef1940" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.491517 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.493012 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.495384 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" event={"ID":"f1b4639a-6a43-4035-9402-cd1006e94e45","Type":"ContainerStarted","Data":"e2c232c3ba995e1b4f691d64160c391075b2e97f6a38e2c5236d9069c7a02329"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.495443 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" event={"ID":"f1b4639a-6a43-4035-9402-cd1006e94e45","Type":"ContainerStarted","Data":"c4fe1bc9cfccc5b93d7be6ee6eb54346be59356bd66d4861ddac78373359d5df"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.499260 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.499297 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4zgtm" event={"ID":"48e2c6f9-1502-4fa6-854d-ef25455dadb1","Type":"ContainerStarted","Data":"ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.499477 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4zgtm" event={"ID":"48e2c6f9-1502-4fa6-854d-ef25455dadb1","Type":"ContainerStarted","Data":"648344bb67c4c2d01c05bb54d1405ace830c6ac7d986664f00917e1afa007c4f"}
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.499529 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rxph\" (UniqueName: \"kubernetes.io/projected/be0ebf80-6feb-4ac2-891e-c46c91dc7664-kube-api-access-7rxph\") pod \"ingress-canary-qvp28\" (UID: \"be0ebf80-6feb-4ac2-891e-c46c91dc7664\") " pod="openshift-ingress-canary/ingress-canary-qvp28"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.499664 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5rc2\" (UniqueName: \"kubernetes.io/projected/3de1e108-32f2-42b5-b5e8-1e0337a8f973-kube-api-access-c5rc2\") pod \"package-server-manager-789f6589d5-l9gl6\" (UID: \"3de1e108-32f2-42b5-b5e8-1e0337a8f973\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6"
Sep 29 09:47:08 crc kubenswrapper[4922]: E0929 09:47:08.499899 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:08.999879308 +0000 UTC m=+154.366109582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.499999 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee42a43-42ec-4f54-b8c0-8721d2816541-config-volume\") pod \"dns-default-zk7zr\" (UID: \"aee42a43-42ec-4f54-b8c0-8721d2816541\") " pod="openshift-dns/dns-default-zk7zr"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500034 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xnmw7\" (UID: \"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500059 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be0ebf80-6feb-4ac2-891e-c46c91dc7664-cert\") pod \"ingress-canary-qvp28\" (UID: \"be0ebf80-6feb-4ac2-891e-c46c91dc7664\") " pod="openshift-ingress-canary/ingress-canary-qvp28"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500083 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500115 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500139 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aee42a43-42ec-4f54-b8c0-8721d2816541-metrics-tls\") pod \"dns-default-zk7zr\" (UID: \"aee42a43-42ec-4f54-b8c0-8721d2816541\") " pod="openshift-dns/dns-default-zk7zr"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500176 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500201 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500229 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tjw8\" (UniqueName: \"kubernetes.io/projected/4370b7e1-a94b-463b-b286-66912465b7fe-kube-api-access-6tjw8\") pod \"kube-storage-version-migrator-operator-b67b599dd-2nmkn\" (UID: \"4370b7e1-a94b-463b-b286-66912465b7fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500259 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500284 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl95g\" (UniqueName: \"kubernetes.io/projected/a24b1532-d6be-4a8e-a843-742f6328c431-kube-api-access-hl95g\") pod \"collect-profiles-29318985-bq6tn\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500311 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjmnv\" (UniqueName: \"kubernetes.io/projected/a2f39a22-ce04-4af3-a76f-adbba71624b6-kube-api-access-xjmnv\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500336 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1-config\") pod \"kube-apiserver-operator-766d6c64bb-xnmw7\" (UID: \"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500361 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4370b7e1-a94b-463b-b286-66912465b7fe-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2nmkn\" (UID: \"4370b7e1-a94b-463b-b286-66912465b7fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500388 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5mr24\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " pod="openshift-marketplace/marketplace-operator-79b997595-5mr24"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500411 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xnmw7\" (UID: \"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500433 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500477 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500515 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfd4f\" (UniqueName: \"kubernetes.io/projected/7c02ebae-a241-4927-914e-ae531fc71bc0-kube-api-access-zfd4f\") pod \"machine-config-server-dcbw5\" (UID: \"7c02ebae-a241-4927-914e-ae531fc71bc0\") " pod="openshift-machine-config-operator/machine-config-server-dcbw5"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500539 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctzdc\" (UniqueName: \"kubernetes.io/projected/081e1e41-0f63-463a-b699-4c680f61122b-kube-api-access-ctzdc\") pod \"marketplace-operator-79b997595-5mr24\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " pod="openshift-marketplace/marketplace-operator-79b997595-5mr24"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500575 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a24b1532-d6be-4a8e-a843-742f6328c431-config-volume\") pod \"collect-profiles-29318985-bq6tn\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500602 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500627 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/272adf48-5f20-40e4-9bf0-563c630f9e12-default-certificate\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500659 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-audit-policies\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500683 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-csi-data-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500705 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/272adf48-5f20-40e4-9bf0-563c630f9e12-stats-auth\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500727 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6tg\" (UniqueName: \"kubernetes.io/projected/272adf48-5f20-40e4-9bf0-563c630f9e12-kube-api-access-8r6tg\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500761 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4370b7e1-a94b-463b-b286-66912465b7fe-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2nmkn\" (UID: \"4370b7e1-a94b-463b-b286-66912465b7fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500789 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500813 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-mountpoint-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500945 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/272adf48-5f20-40e4-9bf0-563c630f9e12-service-ca-bundle\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.500973 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3de1e108-32f2-42b5-b5e8-1e0337a8f973-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l9gl6\" (UID: \"3de1e108-32f2-42b5-b5e8-1e0337a8f973\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501010 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/90a2c218-5fb1-4bcb-873d-a5667f73bc25-signing-key\") pod \"service-ca-9c57cc56f-jrk8f\" (UID: \"90a2c218-5fb1-4bcb-873d-a5667f73bc25\") " pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501033 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhg7q\" (UniqueName: \"kubernetes.io/projected/aee42a43-42ec-4f54-b8c0-8721d2816541-kube-api-access-lhg7q\") pod \"dns-default-zk7zr\" (UID: \"aee42a43-42ec-4f54-b8c0-8721d2816541\") " pod="openshift-dns/dns-default-zk7zr"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501272 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvx8v\" (UniqueName: \"kubernetes.io/projected/4bfe4903-5df0-40a6-9216-56d1a6ae2f9b-kube-api-access-fvx8v\") pod \"service-ca-operator-777779d784-vnsrv\" (UID: \"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv"
Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501296 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7rgz\" (UniqueName: \"kubernetes.io/projected/8395ee11-ec83-495d-843f-f41cdb86d8bb-kube-api-access-p7rgz\") pod \"migrator-59844c95c7-qz8cj\" (UID: \"8395ee11-ec83-495d-843f-f41cdb86d8bb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj"
Sep 29 09:47:08 crc
kubenswrapper[4922]: I0929 09:47:08.501323 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-registration-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501355 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bfe4903-5df0-40a6-9216-56d1a6ae2f9b-serving-cert\") pod \"service-ca-operator-777779d784-vnsrv\" (UID: \"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501382 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k75pc\" (UniqueName: \"kubernetes.io/projected/3174b863-8467-4dec-b1fd-602610f72a9f-kube-api-access-k75pc\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501406 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501430 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: 
\"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501455 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/90a2c218-5fb1-4bcb-873d-a5667f73bc25-signing-cabundle\") pod \"service-ca-9c57cc56f-jrk8f\" (UID: \"90a2c218-5fb1-4bcb-873d-a5667f73bc25\") " pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501478 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a24b1532-d6be-4a8e-a843-742f6328c431-secret-volume\") pod \"collect-profiles-29318985-bq6tn\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501500 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m45fw\" (UniqueName: \"kubernetes.io/projected/90a2c218-5fb1-4bcb-873d-a5667f73bc25-kube-api-access-m45fw\") pod \"service-ca-9c57cc56f-jrk8f\" (UID: \"90a2c218-5fb1-4bcb-873d-a5667f73bc25\") " pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501530 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3174b863-8467-4dec-b1fd-602610f72a9f-audit-dir\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501554 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7c02ebae-a241-4927-914e-ae531fc71bc0-node-bootstrap-token\") 
pod \"machine-config-server-dcbw5\" (UID: \"7c02ebae-a241-4927-914e-ae531fc71bc0\") " pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501581 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501605 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bfe4903-5df0-40a6-9216-56d1a6ae2f9b-config\") pod \"service-ca-operator-777779d784-vnsrv\" (UID: \"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501625 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/272adf48-5f20-40e4-9bf0-563c630f9e12-metrics-certs\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501642 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5mr24\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501660 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/7c02ebae-a241-4927-914e-ae531fc71bc0-certs\") pod \"machine-config-server-dcbw5\" (UID: \"7c02ebae-a241-4927-914e-ae531fc71bc0\") " pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501685 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-plugins-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.501715 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-socket-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.502080 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-socket-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.503297 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.503456 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/aee42a43-42ec-4f54-b8c0-8721d2816541-config-volume\") pod \"dns-default-zk7zr\" (UID: \"aee42a43-42ec-4f54-b8c0-8721d2816541\") " pod="openshift-dns/dns-default-zk7zr" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.503657 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-registration-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.503678 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3174b863-8467-4dec-b1fd-602610f72a9f-audit-dir\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.504342 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a24b1532-d6be-4a8e-a843-742f6328c431-config-volume\") pod \"collect-profiles-29318985-bq6tn\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.505344 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-csi-data-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: E0929 09:47:08.505871 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-09-29 09:47:09.005823226 +0000 UTC m=+154.372053490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.506018 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.506866 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bfe4903-5df0-40a6-9216-56d1a6ae2f9b-config\") pod \"service-ca-operator-777779d784-vnsrv\" (UID: \"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.506871 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/90a2c218-5fb1-4bcb-873d-a5667f73bc25-signing-cabundle\") pod \"service-ca-9c57cc56f-jrk8f\" (UID: \"90a2c218-5fb1-4bcb-873d-a5667f73bc25\") " pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.507536 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.508276 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be0ebf80-6feb-4ac2-891e-c46c91dc7664-cert\") pod \"ingress-canary-qvp28\" (UID: \"be0ebf80-6feb-4ac2-891e-c46c91dc7664\") " pod="openshift-ingress-canary/ingress-canary-qvp28" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.508511 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5mr24\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.508734 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bfe4903-5df0-40a6-9216-56d1a6ae2f9b-serving-cert\") pod \"service-ca-operator-777779d784-vnsrv\" (UID: \"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.510076 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4370b7e1-a94b-463b-b286-66912465b7fe-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2nmkn\" (UID: \"4370b7e1-a94b-463b-b286-66912465b7fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.511161 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1-config\") pod \"kube-apiserver-operator-766d6c64bb-xnmw7\" (UID: \"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.511478 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-plugins-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.511618 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a2f39a22-ce04-4af3-a76f-adbba71624b6-mountpoint-dir\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.513090 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/90a2c218-5fb1-4bcb-873d-a5667f73bc25-signing-key\") pod \"service-ca-9c57cc56f-jrk8f\" (UID: \"90a2c218-5fb1-4bcb-873d-a5667f73bc25\") " pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.516285 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4370b7e1-a94b-463b-b286-66912465b7fe-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2nmkn\" (UID: \"4370b7e1-a94b-463b-b286-66912465b7fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 
09:47:08.517034 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.517321 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.517556 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.517742 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5mr24\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.517811 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/272adf48-5f20-40e4-9bf0-563c630f9e12-metrics-certs\") pod \"router-default-5444994796-fnbn6\" (UID: 
\"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.518098 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-audit-policies\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.518457 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/272adf48-5f20-40e4-9bf0-563c630f9e12-service-ca-bundle\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.520027 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.521681 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.522350 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.522887 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a24b1532-d6be-4a8e-a843-742f6328c431-secret-volume\") pod \"collect-profiles-29318985-bq6tn\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.523277 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/272adf48-5f20-40e4-9bf0-563c630f9e12-stats-auth\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.523644 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.525233 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3de1e108-32f2-42b5-b5e8-1e0337a8f973-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l9gl6\" (UID: \"3de1e108-32f2-42b5-b5e8-1e0337a8f973\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.526263 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xnmw7\" (UID: \"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.531236 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7hvj\" (UniqueName: \"kubernetes.io/projected/7fdbfe98-a7d7-42eb-8d95-b1c6de178188-kube-api-access-b7hvj\") pod \"openshift-apiserver-operator-796bbdcf4f-94448\" (UID: \"7fdbfe98-a7d7-42eb-8d95-b1c6de178188\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.531500 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.533312 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7c02ebae-a241-4927-914e-ae531fc71bc0-node-bootstrap-token\") pod \"machine-config-server-dcbw5\" (UID: \"7c02ebae-a241-4927-914e-ae531fc71bc0\") " 
pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.533458 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8"] Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.542545 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aee42a43-42ec-4f54-b8c0-8721d2816541-metrics-tls\") pod \"dns-default-zk7zr\" (UID: \"aee42a43-42ec-4f54-b8c0-8721d2816541\") " pod="openshift-dns/dns-default-zk7zr" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.542798 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7c02ebae-a241-4927-914e-ae531fc71bc0-certs\") pod \"machine-config-server-dcbw5\" (UID: \"7c02ebae-a241-4927-914e-ae531fc71bc0\") " pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.542982 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.554513 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsktz\" (UniqueName: \"kubernetes.io/projected/dea0217e-c923-4045-9b4f-90a9eff30f93-kube-api-access-dsktz\") pod \"machine-api-operator-5694c8668f-dbc7l\" (UID: \"dea0217e-c923-4045-9b4f-90a9eff30f93\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.557447 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/272adf48-5f20-40e4-9bf0-563c630f9e12-default-certificate\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.568401 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ptqfl"] Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.569114 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-flszb"] Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.572464 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6mc6\" (UniqueName: \"kubernetes.io/projected/b2ea2f47-4732-47bd-9099-c503b5610f43-kube-api-access-b6mc6\") pod \"control-plane-machine-set-operator-78cbb6b69f-cftkt\" (UID: \"b2ea2f47-4732-47bd-9099-c503b5610f43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.583092 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.588683 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9"] Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.591774 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-bound-sa-token\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.602412 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:08 crc kubenswrapper[4922]: E0929 09:47:08.604043 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:09.104022805 +0000 UTC m=+154.470253069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.612114 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67671cfa-2e1f-424a-bd8f-67e25492d817-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlv5h\" (UID: \"67671cfa-2e1f-424a-bd8f-67e25492d817\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.648944 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9zqq\" (UniqueName: \"kubernetes.io/projected/3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907-kube-api-access-r9zqq\") pod \"machine-config-controller-84d6567774-kxkh6\" (UID: \"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" Sep 29 09:47:08 crc kubenswrapper[4922]: W0929 09:47:08.651477 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4be461bc_d63f_4b94_951a_c8df10b91ab9.slice/crio-ca2d42cd58b8b5c616c087df21d24d887afc8cdcf87dad6a2ea829dcfd93c119 WatchSource:0}: Error finding container ca2d42cd58b8b5c616c087df21d24d887afc8cdcf87dad6a2ea829dcfd93c119: Status 404 returned error can't find the container with id ca2d42cd58b8b5c616c087df21d24d887afc8cdcf87dad6a2ea829dcfd93c119 Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.652011 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dn7xl\" (UniqueName: \"kubernetes.io/projected/9fb5ca8c-3962-48f5-af8b-13cf7a012c8d-kube-api-access-dn7xl\") pod \"olm-operator-6b444d44fb-lkql5\" (UID: \"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" Sep 29 09:47:08 crc kubenswrapper[4922]: W0929 09:47:08.668774 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443a0f84_59dd_4acd_8299_f995b071562d.slice/crio-5e32472f39c1e68a58eb90542040d9cf2f0e405bff3041b27d2f7fbf711b9fc2 WatchSource:0}: Error finding container 5e32472f39c1e68a58eb90542040d9cf2f0e405bff3041b27d2f7fbf711b9fc2: Status 404 returned error can't find the container with id 5e32472f39c1e68a58eb90542040d9cf2f0e405bff3041b27d2f7fbf711b9fc2 Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.669897 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jmlm\" (UniqueName: \"kubernetes.io/projected/6d12cf52-a4bb-4de2-8a47-d6e6bd452c43-kube-api-access-9jmlm\") pod \"ingress-operator-5b745b69d9-vwz5w\" (UID: \"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.677046 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.680622 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.682911 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.691809 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rxph\" (UniqueName: \"kubernetes.io/projected/be0ebf80-6feb-4ac2-891e-c46c91dc7664-kube-api-access-7rxph\") pod \"ingress-canary-qvp28\" (UID: \"be0ebf80-6feb-4ac2-891e-c46c91dc7664\") " pod="openshift-ingress-canary/ingress-canary-qvp28" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.704348 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: E0929 09:47:08.704780 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:09.204761573 +0000 UTC m=+154.570991837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.725283 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5rc2\" (UniqueName: \"kubernetes.io/projected/3de1e108-32f2-42b5-b5e8-1e0337a8f973-kube-api-access-c5rc2\") pod \"package-server-manager-789f6589d5-l9gl6\" (UID: \"3de1e108-32f2-42b5-b5e8-1e0337a8f973\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.728404 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvx8v\" (UniqueName: \"kubernetes.io/projected/4bfe4903-5df0-40a6-9216-56d1a6ae2f9b-kube-api-access-fvx8v\") pod \"service-ca-operator-777779d784-vnsrv\" (UID: \"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.734593 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.744179 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.752741 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.758615 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7rgz\" (UniqueName: \"kubernetes.io/projected/8395ee11-ec83-495d-843f-f41cdb86d8bb-kube-api-access-p7rgz\") pod \"migrator-59844c95c7-qz8cj\" (UID: \"8395ee11-ec83-495d-843f-f41cdb86d8bb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.770066 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.780542 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xnmw7\" (UID: \"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.786387 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.793325 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfd4f\" (UniqueName: \"kubernetes.io/projected/7c02ebae-a241-4927-914e-ae531fc71bc0-kube-api-access-zfd4f\") pod \"machine-config-server-dcbw5\" (UID: \"7c02ebae-a241-4927-914e-ae531fc71bc0\") " pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.805650 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:08 crc kubenswrapper[4922]: E0929 09:47:08.806324 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:09.306278561 +0000 UTC m=+154.672508825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.808713 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.813218 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctzdc\" (UniqueName: \"kubernetes.io/projected/081e1e41-0f63-463a-b699-4c680f61122b-kube-api-access-ctzdc\") pod \"marketplace-operator-79b997595-5mr24\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.840371 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k75pc\" (UniqueName: \"kubernetes.io/projected/3174b863-8467-4dec-b1fd-602610f72a9f-kube-api-access-k75pc\") pod \"oauth-openshift-558db77b4-wd7hc\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.853916 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.855941 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.869730 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6tg\" (UniqueName: \"kubernetes.io/projected/272adf48-5f20-40e4-9bf0-563c630f9e12-kube-api-access-8r6tg\") pod \"router-default-5444994796-fnbn6\" (UID: \"272adf48-5f20-40e4-9bf0-563c630f9e12\") " pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.870065 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.882927 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.892795 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.893357 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m45fw\" (UniqueName: \"kubernetes.io/projected/90a2c218-5fb1-4bcb-873d-a5667f73bc25-kube-api-access-m45fw\") pod \"service-ca-9c57cc56f-jrk8f\" (UID: \"90a2c218-5fb1-4bcb-873d-a5667f73bc25\") " pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.896500 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.908295 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:08 crc kubenswrapper[4922]: E0929 09:47:08.909440 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:09.409418192 +0000 UTC m=+154.775648466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.915620 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.921402 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhg7q\" (UniqueName: \"kubernetes.io/projected/aee42a43-42ec-4f54-b8c0-8721d2816541-kube-api-access-lhg7q\") pod \"dns-default-zk7zr\" (UID: \"aee42a43-42ec-4f54-b8c0-8721d2816541\") " pod="openshift-dns/dns-default-zk7zr" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.922930 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zk7zr" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.927322 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tjw8\" (UniqueName: \"kubernetes.io/projected/4370b7e1-a94b-463b-b286-66912465b7fe-kube-api-access-6tjw8\") pod \"kube-storage-version-migrator-operator-b67b599dd-2nmkn\" (UID: \"4370b7e1-a94b-463b-b286-66912465b7fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.928956 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qvp28" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.934913 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl95g\" (UniqueName: \"kubernetes.io/projected/a24b1532-d6be-4a8e-a843-742f6328c431-kube-api-access-hl95g\") pod \"collect-profiles-29318985-bq6tn\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.977310 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dcbw5" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.985711 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjmnv\" (UniqueName: \"kubernetes.io/projected/a2f39a22-ce04-4af3-a76f-adbba71624b6-kube-api-access-xjmnv\") pod \"csi-hostpathplugin-6tlvg\" (UID: \"a2f39a22-ce04-4af3-a76f-adbba71624b6\") " pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:08 crc kubenswrapper[4922]: I0929 09:47:08.986589 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6xjmx"] Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.011136 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:09 crc kubenswrapper[4922]: E0929 09:47:09.011440 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:47:09.511418353 +0000 UTC m=+154.877648617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.013069 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms"] Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.113687 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:09 crc kubenswrapper[4922]: E0929 09:47:09.114248 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:09.614231016 +0000 UTC m=+154.980461280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.117926 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gg855"] Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.162905 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.177473 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.208037 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.215521 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:09 crc kubenswrapper[4922]: E0929 09:47:09.216633 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:47:09.716610797 +0000 UTC m=+155.082841061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.252264 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.317636 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:09 crc kubenswrapper[4922]: E0929 09:47:09.317950 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:09.81793881 +0000 UTC m=+155.184169074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.335705 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dbc7l"] Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.420122 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:09 crc kubenswrapper[4922]: E0929 09:47:09.420969 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:09.920951109 +0000 UTC m=+155.287181373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.499350 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4zgtm" podStartSLOduration=133.499329699 podStartE2EDuration="2m13.499329699s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:09.498061856 +0000 UTC m=+154.864292130" watchObservedRunningTime="2025-09-29 09:47:09.499329699 +0000 UTC m=+154.865559963" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.517594 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx" event={"ID":"df443056-e7b1-48d4-92d5-0cb23872d4c3","Type":"ContainerStarted","Data":"e2e7d2a38c1a75d36a6f4ae4c909b89463051050ba075e76957017d05daf83b0"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.525018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:09 crc kubenswrapper[4922]: E0929 09:47:09.525258 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.0252469 +0000 UTC m=+155.391477164 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.528976 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" event={"ID":"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3","Type":"ContainerStarted","Data":"c0ef24708a72a3b064cf717c22f69068336589a4b8963f68368f118bf6f5d7c0"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.529015 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" event={"ID":"3bb8e9b1-bc1f-4887-bc0a-e27d9b3555b3","Type":"ContainerStarted","Data":"395c5c8de5e07841712a68441e27a1badfdeeccca7165ba495a422238219b2c5"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.548508 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" event={"ID":"1e73604d-9d90-4f5e-bbcb-0ad4272e6553","Type":"ContainerStarted","Data":"5656153a339f33bbbd3df636bff5c703e571ce8c2cfe4d3aebf27c0f59a82e4c"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.569337 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" 
event={"ID":"3f0b9f48-6af7-4c04-8edf-417fc84261a6","Type":"ContainerStarted","Data":"a9cadc954092b82c032095bca2e1d67a22feb0730dacdcacdaa82865a0c70a93"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.592211 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" event={"ID":"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b","Type":"ContainerStarted","Data":"f4db4d5deb5a25dfdd848956fb4ddc41e4e27cc919d3dafa9f31b9efd8bc5df7"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.592274 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" event={"ID":"8af2d9ab-f32b-48be-b6f3-3f1f35a6953b","Type":"ContainerStarted","Data":"3f70452527f448864d0998a9f8e4d36d3fdf839d34d0aba7ef49978176cb5f01"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.596638 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jkk4k" event={"ID":"9903f35e-2c84-46b8-a9de-c3920e709c83","Type":"ContainerStarted","Data":"7b3caef1d3c561154185510dd01ef2b2f160730f86d7c91624a6260f61ed43a8"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.596689 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jkk4k" event={"ID":"9903f35e-2c84-46b8-a9de-c3920e709c83","Type":"ContainerStarted","Data":"2b3b05599cfa3d019418310e1621c46f6019c2cea9a840afbb19599d26ead2a5"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.597950 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.603044 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gg855" 
event={"ID":"55bb4804-b952-4c2e-be70-a1ce082f8b6d","Type":"ContainerStarted","Data":"a47278cefd9ceb6ab104061990ddecf84261e1d4b2607aa47ab790304fa040ad"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.603068 4922 patch_prober.go:28] interesting pod/console-operator-58897d9998-jkk4k container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Sep 29 09:47:09 crc kubenswrapper[4922]: W0929 09:47:09.603119 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c02ebae_a241_4927_914e_ae531fc71bc0.slice/crio-b7f1bf1918d69bd1af9051a817bc0e764207dbc7465bd8d9511f3c3018f22488 WatchSource:0}: Error finding container b7f1bf1918d69bd1af9051a817bc0e764207dbc7465bd8d9511f3c3018f22488: Status 404 returned error can't find the container with id b7f1bf1918d69bd1af9051a817bc0e764207dbc7465bd8d9511f3c3018f22488 Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.603145 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-jkk4k" podUID="9903f35e-2c84-46b8-a9de-c3920e709c83" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.619323 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" event={"ID":"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550","Type":"ContainerStarted","Data":"4f39dd0e608a3d0e4b1b9cd735cb3a028efeb6b6ba6e99b97259dc7a5ed5cebd"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.619473 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" 
event={"ID":"d2ac64a1-a8d6-4c8e-8f8b-0eb37009d550","Type":"ContainerStarted","Data":"1f65c5b4e726da4333cbc97fb79a569b9ff2641d6f2e5ea110f9f54a8de4c906"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.625778 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:09 crc kubenswrapper[4922]: E0929 09:47:09.627525 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.127504928 +0000 UTC m=+155.493735202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.629363 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" event={"ID":"4be461bc-d63f-4b94-951a-c8df10b91ab9","Type":"ContainerStarted","Data":"ca2d42cd58b8b5c616c087df21d24d887afc8cdcf87dad6a2ea829dcfd93c119"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.639618 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" 
event={"ID":"8fb4829f-c7a4-450d-9912-564ef25c4aaf","Type":"ContainerStarted","Data":"5cbd772d275ad4550fb9b2f99cfcb3953f4d51d173d4bf23320e9fe8d72e258d"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.650047 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" event={"ID":"443a0f84-59dd-4acd-8299-f995b071562d","Type":"ContainerStarted","Data":"5e32472f39c1e68a58eb90542040d9cf2f0e405bff3041b27d2f7fbf711b9fc2"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.677058 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" event={"ID":"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1","Type":"ContainerStarted","Data":"63016b1c9cacfd8e99873eaac343c8e18397421a79f4a60f86add44752171823"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.677096 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" event={"ID":"340b5c38-0c4d-40ba-a16d-2cfdee9eb6f1","Type":"ContainerStarted","Data":"0409bf68cacdfbfb45359e01781cd93ae17ae88dc334075ad8ccf8f44787d3db"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.685010 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" event={"ID":"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1","Type":"ContainerStarted","Data":"c9c04f7ec5101b6c546e6108689407245fa663c912d16e79d30ae546705cbed8"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.685071 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" event={"ID":"1e5a5810-dbd5-4c66-92dd-51d669dc6eb1","Type":"ContainerStarted","Data":"be2f0ff162ba9f31120c5b65742c6872ae72f52a4d2426b09f8221cfac3933b2"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.685505 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.690316 4922 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pnjfh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.690376 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" podUID="1e5a5810-dbd5-4c66-92dd-51d669dc6eb1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.692956 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8ftc8" event={"ID":"70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa","Type":"ContainerStarted","Data":"e21888380b92454a3275798d5d36301fc8f89cc8226d70124c770606bc54e630"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.693045 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8ftc8" event={"ID":"70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa","Type":"ContainerStarted","Data":"8b34845662b4e3a296e6c43c4e9b82275ca269e34a4989e0aecb432d7c11db65"} Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.697843 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8ftc8" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.711990 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ftc8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" 
start-of-body= Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.712067 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ftc8" podUID="70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.712434 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.732661 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:09 crc kubenswrapper[4922]: E0929 09:47:09.733007 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.232981413 +0000 UTC m=+155.599211677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.836409 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:09 crc kubenswrapper[4922]: E0929 09:47:09.839587 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.339564416 +0000 UTC m=+155.705794680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.840745 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" podStartSLOduration=133.840709306 podStartE2EDuration="2m13.840709306s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:09.793362103 +0000 UTC m=+155.159592367" watchObservedRunningTime="2025-09-29 09:47:09.840709306 +0000 UTC m=+155.206939580" Sep 29 09:47:09 crc kubenswrapper[4922]: I0929 09:47:09.961366 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pql8x" podStartSLOduration=134.961316994 podStartE2EDuration="2m14.961316994s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:09.957551513 +0000 UTC m=+155.323781777" watchObservedRunningTime="2025-09-29 09:47:09.961316994 +0000 UTC m=+155.327547278" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.021067 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.021522 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.521505079 +0000 UTC m=+155.887735343 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.122630 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.123048 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.623023087 +0000 UTC m=+155.989253351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.123804 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.124484 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.624463525 +0000 UTC m=+155.990693789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.141723 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" podStartSLOduration=134.141702736 podStartE2EDuration="2m14.141702736s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:10.132659094 +0000 UTC m=+155.498889358" watchObservedRunningTime="2025-09-29 09:47:10.141702736 +0000 UTC m=+155.507933000" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.226503 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.227081 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.727057143 +0000 UTC m=+156.093287407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.228615 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt"] Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.246022 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w"] Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.259038 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5"] Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.334517 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.334861 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.834849638 +0000 UTC m=+156.201079902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.366564 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448"] Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.452496 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv"] Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.459022 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.459571 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:10.959541365 +0000 UTC m=+156.325771629 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.465920 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd"] Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.510905 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ppwz4" podStartSLOduration=135.510888004 podStartE2EDuration="2m15.510888004s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:10.510306058 +0000 UTC m=+155.876536342" watchObservedRunningTime="2025-09-29 09:47:10.510888004 +0000 UTC m=+155.877118268" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.554540 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6"] Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.562000 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.563194 4922 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.063180209 +0000 UTC m=+156.429410473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.662578 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.662818 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.162804447 +0000 UTC m=+156.529034711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.729809 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" event={"ID":"dea0217e-c923-4045-9b4f-90a9eff30f93","Type":"ContainerStarted","Data":"5f002fdde09103f243598c98be673d1704fde6b5e1b5b621f63b434da75ffc39"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.729885 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" event={"ID":"dea0217e-c923-4045-9b4f-90a9eff30f93","Type":"ContainerStarted","Data":"bd70c44fabd17841cecc4d7edde4ed7ae3ecfc4a1da19a560e7b86875c8a5414"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.769248 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.769819 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.269809101 +0000 UTC m=+156.636039375 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.776981 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" event={"ID":"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d","Type":"ContainerStarted","Data":"853798f484447569b5e2bf7c85bbbd3e1da470f007ee691d1627837afe4721e0"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.789458 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" event={"ID":"7fdbfe98-a7d7-42eb-8d95-b1c6de178188","Type":"ContainerStarted","Data":"e62c13c6a1c6dae44ee1dd7d3c4ba934197988b54096f3116f1ab37f762bf6c5"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.794636 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fnbn6" event={"ID":"272adf48-5f20-40e4-9bf0-563c630f9e12","Type":"ContainerStarted","Data":"03d613bc4ef79e16e7ffeeac7f0b34fdd0a0504f1242b192083d2beb7672b00a"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.794681 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fnbn6" event={"ID":"272adf48-5f20-40e4-9bf0-563c630f9e12","Type":"ContainerStarted","Data":"327d448013c89d75b2b10af7f1d47333bf1718f3db03461efce720506185fb61"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.804443 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kc92q" podStartSLOduration=134.804428345 podStartE2EDuration="2m14.804428345s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:10.758186161 +0000 UTC m=+156.124416425" watchObservedRunningTime="2025-09-29 09:47:10.804428345 +0000 UTC m=+156.170658599" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.805292 4922 generic.go:334] "Generic (PLEG): container finished" podID="55bb4804-b952-4c2e-be70-a1ce082f8b6d" containerID="882b5945b15ce927504b798c53c1e27460e40ecb1771754b96a89ab6b1eb4a18" exitCode=0 Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.805350 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gg855" event={"ID":"55bb4804-b952-4c2e-be70-a1ce082f8b6d","Type":"ContainerDied","Data":"882b5945b15ce927504b798c53c1e27460e40ecb1771754b96a89ab6b1eb4a18"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.820686 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" event={"ID":"1e73604d-9d90-4f5e-bbcb-0ad4272e6553","Type":"ContainerStarted","Data":"dc7fdfa5d3613346e125988a73c104a8502e67916eb6656cf56b31a953488b33"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.821043 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.841687 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-flszb" podStartSLOduration=134.841671068 podStartE2EDuration="2m14.841671068s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:10.838798961 +0000 UTC m=+156.205029225" watchObservedRunningTime="2025-09-29 09:47:10.841671068 +0000 UTC m=+156.207901332" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.842448 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d6z6c" podStartSLOduration=134.842443668 podStartE2EDuration="2m14.842443668s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:10.804845455 +0000 UTC m=+156.171075719" watchObservedRunningTime="2025-09-29 09:47:10.842443668 +0000 UTC m=+156.208673932" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.847376 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" event={"ID":"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b","Type":"ContainerStarted","Data":"5cc1a541a575fe78965e6fb0004423e755240eefb25171760eb4ab3019ddd62a"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.853557 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dcbw5" event={"ID":"7c02ebae-a241-4927-914e-ae531fc71bc0","Type":"ContainerStarted","Data":"cd1bf0a0d8c4f16f85dfb3436fb242b3f1fc8e03ea854b3d10cb02b28ef38303"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.853607 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dcbw5" event={"ID":"7c02ebae-a241-4927-914e-ae531fc71bc0","Type":"ContainerStarted","Data":"b7f1bf1918d69bd1af9051a817bc0e764207dbc7465bd8d9511f3c3018f22488"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.869076 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console-operator/console-operator-58897d9998-jkk4k" podStartSLOduration=135.869048518 podStartE2EDuration="2m15.869048518s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:10.859540924 +0000 UTC m=+156.225771188" watchObservedRunningTime="2025-09-29 09:47:10.869048518 +0000 UTC m=+156.235278782" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.874306 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.877363 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.377328059 +0000 UTC m=+156.743558323 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.883957 4922 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-r8nms container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.884028 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" podUID="1e73604d-9d90-4f5e-bbcb-0ad4272e6553" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.884103 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.888397 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.888419 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" 
probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.914518 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx" event={"ID":"df443056-e7b1-48d4-92d5-0cb23872d4c3","Type":"ContainerStarted","Data":"62aff4cf788f01bb5bbcdb64574ba7bfe35abe6d29baf9ee76f78225219b926b"} Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.976279 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:10 crc kubenswrapper[4922]: I0929 09:47:10.977157 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" event={"ID":"b2ea2f47-4732-47bd-9099-c503b5610f43","Type":"ContainerStarted","Data":"02b91d26d01f1b15448cac8bd4909ce394489906806ce77c480974df6a6fdb9e"} Sep 29 09:47:10 crc kubenswrapper[4922]: E0929 09:47:10.977441 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.477423989 +0000 UTC m=+156.843654253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.023133 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g" podStartSLOduration=135.023112458 podStartE2EDuration="2m15.023112458s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:11.018436974 +0000 UTC m=+156.384667238" watchObservedRunningTime="2025-09-29 09:47:11.023112458 +0000 UTC m=+156.389342722" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.023934 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7n2c8" podStartSLOduration=135.02393007 podStartE2EDuration="2m15.02393007s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:10.929033659 +0000 UTC m=+156.295263923" watchObservedRunningTime="2025-09-29 09:47:11.02393007 +0000 UTC m=+156.390160334" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.034243 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" 
event={"ID":"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43","Type":"ContainerStarted","Data":"b26588ef78815e11b7a0433da61f6fc634f07bbb89a7fb7d7c4752026022662d"} Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.036202 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.054183 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.062591 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8ftc8" podStartSLOduration=135.06256538 podStartE2EDuration="2m15.06256538s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:11.052937773 +0000 UTC m=+156.419168047" watchObservedRunningTime="2025-09-29 09:47:11.06256538 +0000 UTC m=+156.428795664" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.081843 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:11 crc kubenswrapper[4922]: E0929 09:47:11.082284 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.582257596 +0000 UTC m=+156.948487860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.102091 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6tlvg"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.147983 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" podStartSLOduration=135.147957438 podStartE2EDuration="2m15.147957438s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:11.134388147 +0000 UTC m=+156.500618411" watchObservedRunningTime="2025-09-29 09:47:11.147957438 +0000 UTC m=+156.514187702" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.155354 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" event={"ID":"4be461bc-d63f-4b94-951a-c8df10b91ab9","Type":"ContainerStarted","Data":"7a22f22d47972d9fee401ffdd0a22ce8c5024cca47606b353f3b4a0b9318de60"} Sep 29 09:47:11 crc kubenswrapper[4922]: W0929 09:47:11.168295 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67671cfa_2e1f_424a_bd8f_67e25492d817.slice/crio-4bca36c7dfc056f5aecfd26439699ecddbcfd9543ebf87db16e3b5207f8c7304 WatchSource:0}: Error finding container 4bca36c7dfc056f5aecfd26439699ecddbcfd9543ebf87db16e3b5207f8c7304: Status 404 returned error can't 
find the container with id 4bca36c7dfc056f5aecfd26439699ecddbcfd9543ebf87db16e3b5207f8c7304 Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.182979 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:11 crc kubenswrapper[4922]: E0929 09:47:11.183301 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.683289231 +0000 UTC m=+157.049519495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.194937 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" event={"ID":"443a0f84-59dd-4acd-8299-f995b071562d","Type":"ContainerStarted","Data":"921f9660cc933de9292289dd20f1274299fe8671fcd2f1cf2dd0b8d8477dc10d"} Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.195427 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ftc8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: 
connect: connection refused" start-of-body= Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.195457 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ftc8" podUID="70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.268315 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dw64l" podStartSLOduration=135.268297129 podStartE2EDuration="2m15.268297129s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:11.267315243 +0000 UTC m=+156.633545507" watchObservedRunningTime="2025-09-29 09:47:11.268297129 +0000 UTC m=+156.634527393" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.284380 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:11 crc kubenswrapper[4922]: E0929 09:47:11.285905 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.785890758 +0000 UTC m=+157.152121022 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.313222 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5mr24"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.316750 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qvp28"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.316789 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.387767 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:11 crc kubenswrapper[4922]: E0929 09:47:11.388067 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.888055724 +0000 UTC m=+157.254285988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.426443 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-fnbn6" podStartSLOduration=135.426415887 podStartE2EDuration="2m15.426415887s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:11.411244652 +0000 UTC m=+156.777474916" watchObservedRunningTime="2025-09-29 09:47:11.426415887 +0000 UTC m=+156.792646151" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.467214 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pwxh9" podStartSLOduration=135.467192934 podStartE2EDuration="2m15.467192934s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:11.46663986 +0000 UTC m=+156.832870144" watchObservedRunningTime="2025-09-29 09:47:11.467192934 +0000 UTC m=+156.833423198" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.488863 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:11 crc kubenswrapper[4922]: E0929 09:47:11.489244 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:11.989218402 +0000 UTC m=+157.355448666 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.539717 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.539755 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zk7zr"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.539767 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jrk8f"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.550065 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.551472 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms" podStartSLOduration=135.551439471 podStartE2EDuration="2m15.551439471s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:11.540031207 +0000 UTC m=+156.906261471" watchObservedRunningTime="2025-09-29 09:47:11.551439471 +0000 UTC m=+156.917669735" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.562982 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wd7hc"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.568850 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn"] Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.577972 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-jkk4k" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.590463 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:11 crc kubenswrapper[4922]: E0929 09:47:11.592291 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:12.092279152 +0000 UTC m=+157.458509416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.691176 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:11 crc kubenswrapper[4922]: E0929 09:47:11.691555 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:12.191535659 +0000 UTC m=+157.557765923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.692250 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dcbw5" podStartSLOduration=6.692209127 podStartE2EDuration="6.692209127s" podCreationTimestamp="2025-09-29 09:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:11.682887318 +0000 UTC m=+157.049117592" watchObservedRunningTime="2025-09-29 09:47:11.692209127 +0000 UTC m=+157.058439391" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.792915 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:11 crc kubenswrapper[4922]: E0929 09:47:11.793410 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:12.293392216 +0000 UTC m=+157.659622480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.892399 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:47:11 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Sep 29 09:47:11 crc kubenswrapper[4922]: [+]process-running ok Sep 29 09:47:11 crc kubenswrapper[4922]: healthz check failed Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.892803 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:47:11 crc kubenswrapper[4922]: I0929 09:47:11.894377 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:11 crc kubenswrapper[4922]: E0929 09:47:11.894684 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:47:12.394669018 +0000 UTC m=+157.760899282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.002473 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.002786 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:12.502773902 +0000 UTC m=+157.869004156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.105414 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.105577 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:12.605537123 +0000 UTC m=+157.971767387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.106092 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.106406 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:12.606391356 +0000 UTC m=+157.972621620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.195267 4922 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pnjfh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.195932 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" podUID="1e5a5810-dbd5-4c66-92dd-51d669dc6eb1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.208025 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.208420 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:47:12.708401127 +0000 UTC m=+158.074631391 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.216494 4922 generic.go:334] "Generic (PLEG): container finished" podID="d9227cd1-8691-4427-a216-1348be4c56fb" containerID="49f658ce573db1f29a68b7259549c2a286ddb5891d8d722bc8cea992c05920a8" exitCode=0 Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.216965 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" event={"ID":"d9227cd1-8691-4427-a216-1348be4c56fb","Type":"ContainerDied","Data":"49f658ce573db1f29a68b7259549c2a286ddb5891d8d722bc8cea992c05920a8"} Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.217022 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" event={"ID":"d9227cd1-8691-4427-a216-1348be4c56fb","Type":"ContainerStarted","Data":"720b58aeec05df445e61a699b70f176fff5e54e4d1f40b10e0f360b3c4cb535c"} Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.219332 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" event={"ID":"3174b863-8467-4dec-b1fd-602610f72a9f","Type":"ContainerStarted","Data":"d9dacfe5b8f8f6ed6d7342c3caefec58f882255b7611ef1c8cda5a8f9ecb6623"} Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.224105 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" event={"ID":"a2f39a22-ce04-4af3-a76f-adbba71624b6","Type":"ContainerStarted","Data":"ed62e6efb187b6a9b3ef7831f515c4cb64170d84ea6d75f6b45959f56aaf1ac9"} Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.257056 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" event={"ID":"90a2c218-5fb1-4bcb-873d-a5667f73bc25","Type":"ContainerStarted","Data":"24a36c32a646151cf1a3667fffda7895bccfca02f6be1452c2dd4db4fd9049b4"} Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.309598 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.310369 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:12.810342676 +0000 UTC m=+158.176572940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.331487 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.331851 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.337237 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" event={"ID":"081e1e41-0f63-463a-b699-4c680f61122b","Type":"ContainerStarted","Data":"937bfc6c9ff483eead76c099eff3fd59cf0a02ab2c2a5493f9604a1c8941e219"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.341958 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" event={"ID":"3de1e108-32f2-42b5-b5e8-1e0337a8f973","Type":"ContainerStarted","Data":"7e10c8ee44a7a81d57f8b56cab2d8b8c3e827ca0752ee701990c4a366ff2d7cf"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.342011 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" event={"ID":"3de1e108-32f2-42b5-b5e8-1e0337a8f973","Type":"ContainerStarted","Data":"49e5a6d5791149209e78b6ac93237d341e0cd6356722bd5db47cb6ed591ea1a5"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.346224 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" event={"ID":"4bfe4903-5df0-40a6-9216-56d1a6ae2f9b","Type":"ContainerStarted","Data":"97d6ff7f4ec1c55b2a5e240b8d06e5684c0354454aa002557ec3c117832e1c9a"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.367401 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.369242 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx" event={"ID":"df443056-e7b1-48d4-92d5-0cb23872d4c3","Type":"ContainerStarted","Data":"a076342080f2cc23534a4c2722e4ad4c7612957be07b6d1af03b9444305b3559"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.376235 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zk7zr" event={"ID":"aee42a43-42ec-4f54-b8c0-8721d2816541","Type":"ContainerStarted","Data":"6f331c2dd474aa8487a174bda9c4148389bcd209aa31efd3620d1a41684d71c9"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.386603 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnsrv" podStartSLOduration=135.38658129 podStartE2EDuration="2m15.38658129s" podCreationTimestamp="2025-09-29 09:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.383687643 +0000 UTC m=+157.749917907" watchObservedRunningTime="2025-09-29 09:47:12.38658129 +0000 UTC m=+157.752811554"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.392204 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" event={"ID":"9fb5ca8c-3962-48f5-af8b-13cf7a012c8d","Type":"ContainerStarted","Data":"6eaac73fc8345ca69a415d492927a98f0ad5b108003b289d841e8aeb50e4fd86"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.393334 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.410753 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.412679 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:12.912658137 +0000 UTC m=+158.278888431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.417917 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6xjmx" podStartSLOduration=136.417880095 podStartE2EDuration="2m16.417880095s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.416287563 +0000 UTC m=+157.782517837" watchObservedRunningTime="2025-09-29 09:47:12.417880095 +0000 UTC m=+157.784110359"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.422551 4922 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lkql5 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.422652 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" podUID="9fb5ca8c-3962-48f5-af8b-13cf7a012c8d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.430800 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" event={"ID":"7fdbfe98-a7d7-42eb-8d95-b1c6de178188","Type":"ContainerStarted","Data":"536b6a6d294c0ac36fb65240b0456e914abfb32418b1cb245344032255f9edba"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.441415 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" event={"ID":"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1","Type":"ContainerStarted","Data":"56b84840a9786f814d5b411984fc6cad6dcb911c8f8785883f0bb2fd93c1ea39"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.485629 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-94448" podStartSLOduration=136.485602152 podStartE2EDuration="2m16.485602152s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.484615646 +0000 UTC m=+157.850845900" watchObservedRunningTime="2025-09-29 09:47:12.485602152 +0000 UTC m=+157.851832416"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.495917 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gg855" event={"ID":"55bb4804-b952-4c2e-be70-a1ce082f8b6d","Type":"ContainerStarted","Data":"d0ace204453fa4b1e0c77e9afb5173142e0acce1916f1919556fe0c49f67a1a3"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.500329 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn" event={"ID":"4370b7e1-a94b-463b-b286-66912465b7fe","Type":"ContainerStarted","Data":"8ccafc3e07faedc9a5dd8b6ab172a812aa5167e79f9397b59f6ab4545e0cf4e4"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.509555 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" event={"ID":"4be461bc-d63f-4b94-951a-c8df10b91ab9","Type":"ContainerStarted","Data":"b9371fde613cdb95473f6f8f0fc8fe9900d51a9de55a8c500dcfa31eb54af856"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.511955 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.512141 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" podStartSLOduration=136.51212407 podStartE2EDuration="2m16.51212407s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.508497683 +0000 UTC m=+157.874727947" watchObservedRunningTime="2025-09-29 09:47:12.51212407 +0000 UTC m=+157.878354334"
Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.513125 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.013112795 +0000 UTC m=+158.379343059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.534271 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" event={"ID":"67671cfa-2e1f-424a-bd8f-67e25492d817","Type":"ContainerStarted","Data":"4bca36c7dfc056f5aecfd26439699ecddbcfd9543ebf87db16e3b5207f8c7304"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.534735 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ptqfl" podStartSLOduration=136.534704502 podStartE2EDuration="2m16.534704502s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.531850456 +0000 UTC m=+157.898080730" watchObservedRunningTime="2025-09-29 09:47:12.534704502 +0000 UTC m=+157.900934766"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.552654 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" event={"ID":"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907","Type":"ContainerStarted","Data":"4afbdeeb8c114316892b1f29a2dc164e580355692687d3d7cc65e9b420fd9786"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.552713 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" event={"ID":"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907","Type":"ContainerStarted","Data":"41f81408d7b48c0564e7997298a11a866744f7f0b3c188d2ae01d102d31c0506"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.552727 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" event={"ID":"3c0d6dcf-4c77-4ed3-88bc-1a074f2fa907","Type":"ContainerStarted","Data":"b1576cc160ec5fb478b9d0368cc260f747741e5c8e1baa2ab63db6f68cac37f4"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.570821 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" podStartSLOduration=136.570805294 podStartE2EDuration="2m16.570805294s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.570038504 +0000 UTC m=+157.936268768" watchObservedRunningTime="2025-09-29 09:47:12.570805294 +0000 UTC m=+157.937035558"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.582690 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" event={"ID":"b2ea2f47-4732-47bd-9099-c503b5610f43","Type":"ContainerStarted","Data":"fc564d9017d861e209f30abf469654a4968dcd28bbde060d0ae04cd9368c88e5"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.592740 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxkh6" podStartSLOduration=136.592711789 podStartE2EDuration="2m16.592711789s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.592684998 +0000 UTC m=+157.958915272" watchObservedRunningTime="2025-09-29 09:47:12.592711789 +0000 UTC m=+157.958942053"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.613075 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" event={"ID":"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43","Type":"ContainerStarted","Data":"808f304fd850c3f5c3254c5b5e4d021a8d92fc2378c913ba290f80d42fddd826"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.613128 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" event={"ID":"6d12cf52-a4bb-4de2-8a47-d6e6bd452c43","Type":"ContainerStarted","Data":"e97b79af0c5d46b1b41fffde769b37a2298ffcc3f2e5203ce1faea2dfd342d24"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.613971 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.615781 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.115760554 +0000 UTC m=+158.481990878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.623127 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cftkt" podStartSLOduration=136.6231064 podStartE2EDuration="2m16.6231064s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.618917028 +0000 UTC m=+157.985147302" watchObservedRunningTime="2025-09-29 09:47:12.6231064 +0000 UTC m=+157.989336654"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.639243 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" event={"ID":"dea0217e-c923-4045-9b4f-90a9eff30f93","Type":"ContainerStarted","Data":"ec3167903557a05ef798448660bf19caa176976700ad44b983e8ad5f08096e5f"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.651478 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vwz5w" podStartSLOduration=136.651449486 podStartE2EDuration="2m16.651449486s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.64861215 +0000 UTC m=+158.014842414" watchObservedRunningTime="2025-09-29 09:47:12.651449486 +0000 UTC m=+158.017679750"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.684699 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj" event={"ID":"8395ee11-ec83-495d-843f-f41cdb86d8bb","Type":"ContainerStarted","Data":"7cba56b91e3c073b69360d6d7b506312d72c82349001f1d0e7868c7d8087bb7a"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.688577 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" event={"ID":"a24b1532-d6be-4a8e-a843-742f6328c431","Type":"ContainerStarted","Data":"48650bacf61792045d993e8c47659b1e73eb28f0e910022ea3053b405786a628"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.688626 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" event={"ID":"a24b1532-d6be-4a8e-a843-742f6328c431","Type":"ContainerStarted","Data":"816f61faade8d6cd4a209cd8a5611daeedb0178eaa11cb52a4c1a67c74fd0e69"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.696526 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qvp28" event={"ID":"be0ebf80-6feb-4ac2-891e-c46c91dc7664","Type":"ContainerStarted","Data":"fd7c32acf6ba4a59cf1f59527a170a96417f9894c416cab4f8bc25e73b8b999c"}
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.696530 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-dbc7l" podStartSLOduration=136.696511078 podStartE2EDuration="2m16.696511078s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.693136548 +0000 UTC m=+158.059366832" watchObservedRunningTime="2025-09-29 09:47:12.696511078 +0000 UTC m=+158.062741342"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.702554 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ftc8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body=
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.702620 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ftc8" podUID="70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.707569 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccl4g"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.717885 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.720244 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r8nms"
Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.722595 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.222573283 +0000 UTC m=+158.588803727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.726860 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" podStartSLOduration=132.726823367 podStartE2EDuration="2m12.726823367s" podCreationTimestamp="2025-09-29 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.72657019 +0000 UTC m=+158.092800454" watchObservedRunningTime="2025-09-29 09:47:12.726823367 +0000 UTC m=+158.093053621"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.778389 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qvp28" podStartSLOduration=7.778372102 podStartE2EDuration="7.778372102s" podCreationTimestamp="2025-09-29 09:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:12.777359305 +0000 UTC m=+158.143589569" watchObservedRunningTime="2025-09-29 09:47:12.778372102 +0000 UTC m=+158.144602366"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.821286 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.822528 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.322508189 +0000 UTC m=+158.688738453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.895390 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 29 09:47:12 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld
Sep 29 09:47:12 crc kubenswrapper[4922]: [+]process-running ok
Sep 29 09:47:12 crc kubenswrapper[4922]: healthz check failed
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.895439 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 29 09:47:12 crc kubenswrapper[4922]: I0929 09:47:12.923509 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:12 crc kubenswrapper[4922]: E0929 09:47:12.923937 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.423920695 +0000 UTC m=+158.790150969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.024467 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.024699 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.524673753 +0000 UTC m=+158.890904017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.025072 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.025381 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.525368581 +0000 UTC m=+158.891598845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.125953 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.126342 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.626314163 +0000 UTC m=+158.992544427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.227227 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.227541 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.727525824 +0000 UTC m=+159.093756088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.329121 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.329341 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.829297698 +0000 UTC m=+159.195527962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.329665 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.330028 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.830019738 +0000 UTC m=+159.196250002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.431252 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.431433 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.931404992 +0000 UTC m=+159.297635256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.431485 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq"
Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.431853 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:13.931845344 +0000 UTC m=+159.298075608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.438657 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p5kxg"]
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.439645 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5kxg"
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.442727 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.460954 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5kxg"]
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.532659 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.532883 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzcnl\" (UniqueName: \"kubernetes.io/projected/0118e414-3687-49dc-acc6-454d86e13dfd-kube-api-access-bzcnl\") pod \"certified-operators-p5kxg\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") "
pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.532917 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-catalog-content\") pod \"certified-operators-p5kxg\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.532949 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-utilities\") pod \"certified-operators-p5kxg\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.533069 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:14.033049124 +0000 UTC m=+159.399279448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.621213 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ml9j4"] Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.622218 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.626686 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.633664 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-catalog-content\") pod \"certified-operators-p5kxg\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.633701 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-utilities\") pod \"certified-operators-p5kxg\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.634286 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.634577 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:14.134567032 +0000 UTC m=+159.500797296 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.634704 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-catalog-content\") pod \"certified-operators-p5kxg\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.634736 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzcnl\" (UniqueName: \"kubernetes.io/projected/0118e414-3687-49dc-acc6-454d86e13dfd-kube-api-access-bzcnl\") pod \"certified-operators-p5kxg\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.634787 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-utilities\") pod \"certified-operators-p5kxg\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.640443 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ml9j4"] Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.677693 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzcnl\" (UniqueName: \"kubernetes.io/projected/0118e414-3687-49dc-acc6-454d86e13dfd-kube-api-access-bzcnl\") pod \"certified-operators-p5kxg\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.718665 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlv5h" event={"ID":"67671cfa-2e1f-424a-bd8f-67e25492d817","Type":"ContainerStarted","Data":"cea2ed7dc3114c3210acc3c4c55d078e0b57a9e9ebd0612737ff5e83a62d1381"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.737556 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.737855 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcwl8\" (UniqueName: \"kubernetes.io/projected/5969f093-753c-4213-8312-4a5c43cc6519-kube-api-access-qcwl8\") pod \"community-operators-ml9j4\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " 
pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.737886 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-utilities\") pod \"community-operators-ml9j4\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.737933 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-catalog-content\") pod \"community-operators-ml9j4\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.738051 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:14.238032432 +0000 UTC m=+159.604262696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.739045 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" event={"ID":"d9227cd1-8691-4427-a216-1348be4c56fb","Type":"ContainerStarted","Data":"ef4ae8613fd2c71a73408ec56cc73d30e8692af9beeed3c647ac59ec1ad2bc55"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.740084 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.758199 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.762350 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gg855" event={"ID":"55bb4804-b952-4c2e-be70-a1ce082f8b6d","Type":"ContainerStarted","Data":"a0b42724854e452637b7a349d470043b43e9981abcd40a38311ab83a05d7072f"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.771218 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn" event={"ID":"4370b7e1-a94b-463b-b286-66912465b7fe","Type":"ContainerStarted","Data":"3b7e361b3695dc9a46acd3279428d04f43859aef2adfe8c4684ce327d7bfe1ab"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.771697 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" podStartSLOduration=138.771681119 podStartE2EDuration="2m18.771681119s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:13.770284622 +0000 UTC m=+159.136514896" watchObservedRunningTime="2025-09-29 09:47:13.771681119 +0000 UTC m=+159.137911383" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.781735 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" event={"ID":"081e1e41-0f63-463a-b699-4c680f61122b","Type":"ContainerStarted","Data":"f9b7d1341896d7c996f55848f24200138494a7ca3df906655bc946484034bce4"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.782902 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.785668 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qvp28" event={"ID":"be0ebf80-6feb-4ac2-891e-c46c91dc7664","Type":"ContainerStarted","Data":"24905f9afd3087149a70558d537df3572439f633308ea347c75c4e682b6b504b"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.790241 4922 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5mr24 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.790281 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" podUID="081e1e41-0f63-463a-b699-4c680f61122b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.799281 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" event={"ID":"3de1e108-32f2-42b5-b5e8-1e0337a8f973","Type":"ContainerStarted","Data":"a4e3bb9b0ae6a6cc541561cc0e465afa22d2a428eb2b2a8156bd16c2d407c8a8"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.812207 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.827542 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" event={"ID":"3174b863-8467-4dec-b1fd-602610f72a9f","Type":"ContainerStarted","Data":"d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.828365 4922 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.831469 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gg855" podStartSLOduration=137.831441354 podStartE2EDuration="2m17.831441354s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:13.811545873 +0000 UTC m=+159.177776137" watchObservedRunningTime="2025-09-29 09:47:13.831441354 +0000 UTC m=+159.197671628" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.839352 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.839443 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcwl8\" (UniqueName: \"kubernetes.io/projected/5969f093-753c-4213-8312-4a5c43cc6519-kube-api-access-qcwl8\") pod \"community-operators-ml9j4\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.839522 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-utilities\") pod \"community-operators-ml9j4\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.839672 
4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-catalog-content\") pod \"community-operators-ml9j4\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.841229 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" event={"ID":"3f7830e8-c7ff-4b90-b3a4-8f3c97ecc1e1","Type":"ContainerStarted","Data":"2fb10433ecf7f08e678daf139ec665e758840029260263aa5f4982f832c24095"} Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.846672 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:14.34665121 +0000 UTC m=+159.712881474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.850815 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-utilities\") pod \"community-operators-ml9j4\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.852093 4922 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wd7hc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.41:6443/healthz\": dial tcp 10.217.0.41:6443: connect: connection refused" start-of-body= Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.852127 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" podUID="3174b863-8467-4dec-b1fd-602610f72a9f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.41:6443/healthz\": dial tcp 10.217.0.41:6443: connect: connection refused" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.852718 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-catalog-content\") pod \"community-operators-ml9j4\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc 
kubenswrapper[4922]: I0929 09:47:13.862717 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wvpd7"] Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.864721 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.875469 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zk7zr" event={"ID":"aee42a43-42ec-4f54-b8c0-8721d2816541","Type":"ContainerStarted","Data":"56e0c010037aa6dbebbbc4963d9ee59f8f8db9ce14ef8175c6367e5d6fcb8734"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.875536 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zk7zr" event={"ID":"aee42a43-42ec-4f54-b8c0-8721d2816541","Type":"ContainerStarted","Data":"625343e78c71ca30d8d0fa1615a7c2770ae9b916873db9f3527f3284d5eeec13"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.886078 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2nmkn" podStartSLOduration=137.8860366 podStartE2EDuration="2m17.8860366s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:13.861577917 +0000 UTC m=+159.227808201" watchObservedRunningTime="2025-09-29 09:47:13.8860366 +0000 UTC m=+159.252266864" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.888967 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wvpd7"] Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.889621 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" 
podStartSLOduration=137.889611305 podStartE2EDuration="2m17.889611305s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:13.888539437 +0000 UTC m=+159.254769701" watchObservedRunningTime="2025-09-29 09:47:13.889611305 +0000 UTC m=+159.255841569" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.904723 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcwl8\" (UniqueName: \"kubernetes.io/projected/5969f093-753c-4213-8312-4a5c43cc6519-kube-api-access-qcwl8\") pod \"community-operators-ml9j4\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.909760 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:47:13 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Sep 29 09:47:13 crc kubenswrapper[4922]: [+]process-running ok Sep 29 09:47:13 crc kubenswrapper[4922]: healthz check failed Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.909861 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.910397 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" event={"ID":"a2f39a22-ce04-4af3-a76f-adbba71624b6","Type":"ContainerStarted","Data":"9550c4b895f772445ee1b5c4b39c05d8aeabbb02839b2eaedb310320c75bcca5"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 
09:47:13.937174 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.940654 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.940997 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-utilities\") pod \"certified-operators-wvpd7\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.941060 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx9dw\" (UniqueName: \"kubernetes.io/projected/cfcb9837-d910-452a-9c93-e842d5c6bcde-kube-api-access-qx9dw\") pod \"certified-operators-wvpd7\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.941126 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-catalog-content\") pod \"certified-operators-wvpd7\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:13 crc kubenswrapper[4922]: E0929 09:47:13.942349 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:14.442330972 +0000 UTC m=+159.808561236 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.945412 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" podStartSLOduration=137.945384963 podStartE2EDuration="2m17.945384963s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:13.944194651 +0000 UTC m=+159.310424915" watchObservedRunningTime="2025-09-29 09:47:13.945384963 +0000 UTC m=+159.311615227" Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.974300 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj" event={"ID":"8395ee11-ec83-495d-843f-f41cdb86d8bb","Type":"ContainerStarted","Data":"5345bc3140834d09b84071c7c35d6c7b87f626849e40ddd65626bc3522c831ea"} Sep 29 09:47:13 crc kubenswrapper[4922]: I0929 09:47:13.974693 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj" 
event={"ID":"8395ee11-ec83-495d-843f-f41cdb86d8bb","Type":"ContainerStarted","Data":"fff70896d6d88d22039cf848d4d9ecaef086ca2e9a6d6a2f0ccd0e5b04287908"} Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.023103 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zk7zr" podStartSLOduration=9.023067896 podStartE2EDuration="9.023067896s" podCreationTimestamp="2025-09-29 09:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:14.009234237 +0000 UTC m=+159.375464511" watchObservedRunningTime="2025-09-29 09:47:14.023067896 +0000 UTC m=+159.389298160" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.024578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" event={"ID":"90a2c218-5fb1-4bcb-873d-a5667f73bc25","Type":"ContainerStarted","Data":"c9266472c64c7880c5f823515492d9ee04fb7020acd4eea98d264850767d0600"} Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.033350 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8z872"] Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.034339 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.050288 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx9dw\" (UniqueName: \"kubernetes.io/projected/cfcb9837-d910-452a-9c93-e842d5c6bcde-kube-api-access-qx9dw\") pod \"certified-operators-wvpd7\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.050375 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-catalog-content\") pod \"certified-operators-wvpd7\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.050398 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xnmw7" podStartSLOduration=138.050384074 podStartE2EDuration="2m18.050384074s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:14.049738787 +0000 UTC m=+159.415969051" watchObservedRunningTime="2025-09-29 09:47:14.050384074 +0000 UTC m=+159.416614338" Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.050733 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:14.550717203 +0000 UTC m=+159.916947467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.052465 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-catalog-content\") pod \"certified-operators-wvpd7\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.050428 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.052606 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-utilities\") pod \"certified-operators-wvpd7\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.052885 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-utilities\") pod \"certified-operators-wvpd7\" (UID: 
\"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.058768 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkql5" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.077138 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8z872"] Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.107482 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx9dw\" (UniqueName: \"kubernetes.io/projected/cfcb9837-d910-452a-9c93-e842d5c6bcde-kube-api-access-qx9dw\") pod \"certified-operators-wvpd7\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.138701 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" podStartSLOduration=138.138672199 podStartE2EDuration="2m18.138672199s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:14.136026709 +0000 UTC m=+159.502256973" watchObservedRunningTime="2025-09-29 09:47:14.138672199 +0000 UTC m=+159.504902453" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.155858 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.156152 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z88mp\" (UniqueName: \"kubernetes.io/projected/f9235b49-72e8-47d7-8959-1950443e6175-kube-api-access-z88mp\") pod \"community-operators-8z872\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.156248 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-catalog-content\") pod \"community-operators-8z872\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.156287 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-utilities\") pod \"community-operators-8z872\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.157146 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:14.657127621 +0000 UTC m=+160.023357885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.207643 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qz8cj" podStartSLOduration=138.207627449 podStartE2EDuration="2m18.207627449s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:14.207249109 +0000 UTC m=+159.573479373" watchObservedRunningTime="2025-09-29 09:47:14.207627449 +0000 UTC m=+159.573857713" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.214935 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.258779 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z88mp\" (UniqueName: \"kubernetes.io/projected/f9235b49-72e8-47d7-8959-1950443e6175-kube-api-access-z88mp\") pod \"community-operators-8z872\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.258844 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-catalog-content\") pod \"community-operators-8z872\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.258868 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-utilities\") pod \"community-operators-8z872\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.258892 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.259208 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-29 09:47:14.759195144 +0000 UTC m=+160.125425398 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.259887 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-catalog-content\") pod \"community-operators-8z872\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.264245 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-utilities\") pod \"community-operators-8z872\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.267310 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-jrk8f" podStartSLOduration=138.26727647 podStartE2EDuration="2m18.26727647s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:14.258078165 +0000 UTC m=+159.624308429" watchObservedRunningTime="2025-09-29 09:47:14.26727647 +0000 UTC m=+159.633506734" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.291623 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z88mp\" (UniqueName: \"kubernetes.io/projected/f9235b49-72e8-47d7-8959-1950443e6175-kube-api-access-z88mp\") pod \"community-operators-8z872\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.361382 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.361687 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:14.861667769 +0000 UTC m=+160.227898033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.408150 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.430130 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5kxg"] Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.463515 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.464527 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:14.964512622 +0000 UTC m=+160.330742886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.566150 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ml9j4"] Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.567041 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.567698 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:15.067679344 +0000 UTC m=+160.433909608 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:14 crc kubenswrapper[4922]: W0929 09:47:14.581866 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5969f093_753c_4213_8312_4a5c43cc6519.slice/crio-ccb03c15f0082352e494b4dff7f1e16686da8c67e655dca1f9f3c0ea82bb6cbd WatchSource:0}: Error finding container ccb03c15f0082352e494b4dff7f1e16686da8c67e655dca1f9f3c0ea82bb6cbd: Status 404 returned error can't find the container with id ccb03c15f0082352e494b4dff7f1e16686da8c67e655dca1f9f3c0ea82bb6cbd Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.677608 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.678009 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:15.177995506 +0000 UTC m=+160.544225760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.783522 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.784356 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:15.284334033 +0000 UTC m=+160.650564297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.809658 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wvpd7"] Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.891317 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.891883 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:15.391862872 +0000 UTC m=+160.758093136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.900386 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:47:14 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Sep 29 09:47:14 crc kubenswrapper[4922]: [+]process-running ok Sep 29 09:47:14 crc kubenswrapper[4922]: healthz check failed Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.900452 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.924285 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zk7zr" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.985603 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.986343 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:14 crc kubenswrapper[4922]: I0929 09:47:14.999277 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:14 crc kubenswrapper[4922]: E0929 09:47:14.999657 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:15.499638297 +0000 UTC m=+160.865868561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.014308 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.017154 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.017531 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.082125 4922 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/community-operators-8z872"] Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.106739 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.107210 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcca377a-490a-4cbe-99be-418b2c1f42f1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bcca377a-490a-4cbe-99be-418b2c1f42f1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.107266 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcca377a-490a-4cbe-99be-418b2c1f42f1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bcca377a-490a-4cbe-99be-418b2c1f42f1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.106922 4922 generic.go:334] "Generic (PLEG): container finished" podID="0118e414-3687-49dc-acc6-454d86e13dfd" containerID="6ff52c69036020433aec88b6f7672a4b38af1f4938cf73782cea2fc7baf31013" exitCode=0 Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.106950 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5kxg" event={"ID":"0118e414-3687-49dc-acc6-454d86e13dfd","Type":"ContainerDied","Data":"6ff52c69036020433aec88b6f7672a4b38af1f4938cf73782cea2fc7baf31013"} Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.107486 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5kxg" event={"ID":"0118e414-3687-49dc-acc6-454d86e13dfd","Type":"ContainerStarted","Data":"5142f1f7c0e0cb0c33aa0cf2ba63925f967dd80170aaf476921dbbfb04ac6860"} Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 09:47:15.108141 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:15.608126121 +0000 UTC m=+160.974356385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.110448 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wvpd7" event={"ID":"cfcb9837-d910-452a-9c93-e842d5c6bcde","Type":"ContainerStarted","Data":"7258d45f8185c24cfdabdb7664a5bc75b0bd8839ea41444aad546bc28497b582"} Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.123774 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.151553 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" event={"ID":"a2f39a22-ce04-4af3-a76f-adbba71624b6","Type":"ContainerStarted","Data":"d4be79c58e41f6f3a33f00789eb88e07591dc0bfa4591b018863194dc82197ff"} Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.174565 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-ml9j4" event={"ID":"5969f093-753c-4213-8312-4a5c43cc6519","Type":"ContainerStarted","Data":"8e4cb69976d02373147859359220f3add96bd7a421ca0c602973b63697dca088"} Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.174619 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9j4" event={"ID":"5969f093-753c-4213-8312-4a5c43cc6519","Type":"ContainerStarted","Data":"ccb03c15f0082352e494b4dff7f1e16686da8c67e655dca1f9f3c0ea82bb6cbd"} Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.198115 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.198223 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.211086 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.211397 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcca377a-490a-4cbe-99be-418b2c1f42f1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bcca377a-490a-4cbe-99be-418b2c1f42f1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.211514 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcca377a-490a-4cbe-99be-418b2c1f42f1-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"bcca377a-490a-4cbe-99be-418b2c1f42f1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 09:47:15.211875 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:15.711853718 +0000 UTC m=+161.078083982 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.212602 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcca377a-490a-4cbe-99be-418b2c1f42f1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bcca377a-490a-4cbe-99be-418b2c1f42f1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.262154 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcca377a-490a-4cbe-99be-418b2c1f42f1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bcca377a-490a-4cbe-99be-418b2c1f42f1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.312982 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 09:47:15.321426 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:15.82140399 +0000 UTC m=+161.187634254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.378229 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jzsqd" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.414710 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 09:47:15.415067 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:47:15.915047369 +0000 UTC m=+161.281277643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.430751 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8h8h5"] Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.431772 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.438215 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.446704 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.460552 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8h8h5"] Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.520930 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-utilities\") pod \"redhat-marketplace-8h8h5\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.521401 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.521450 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngcp4\" (UniqueName: \"kubernetes.io/projected/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-kube-api-access-ngcp4\") pod \"redhat-marketplace-8h8h5\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.521484 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-catalog-content\") pod \"redhat-marketplace-8h8h5\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 
09:47:15.521914 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.021891249 +0000 UTC m=+161.388121503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.628436 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.628647 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-utilities\") pod \"redhat-marketplace-8h8h5\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.628684 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngcp4\" (UniqueName: \"kubernetes.io/projected/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-kube-api-access-ngcp4\") pod \"redhat-marketplace-8h8h5\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: 
I0929 09:47:15.628705 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-catalog-content\") pod \"redhat-marketplace-8h8h5\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 09:47:15.630286 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.13024661 +0000 UTC m=+161.496476874 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.630557 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-utilities\") pod \"redhat-marketplace-8h8h5\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.630694 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-catalog-content\") pod \"redhat-marketplace-8h8h5\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc 
kubenswrapper[4922]: I0929 09:47:15.673503 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngcp4\" (UniqueName: \"kubernetes.io/projected/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-kube-api-access-ngcp4\") pod \"redhat-marketplace-8h8h5\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.731618 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 09:47:15.732994 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.2329778 +0000 UTC m=+161.599208064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.734820 4922 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.814426 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.823674 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zbhzj"] Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.825929 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.834485 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 09:47:15.834656 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.334627022 +0000 UTC m=+161.700857286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.834704 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 09:47:15.835570 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.335559436 +0000 UTC m=+161.701789700 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.849780 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbhzj"] Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.852857 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.894164 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:47:15 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Sep 29 09:47:15 crc kubenswrapper[4922]: [+]process-running ok Sep 29 09:47:15 crc kubenswrapper[4922]: healthz check failed Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.894248 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.936422 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.936681 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkv2b\" (UniqueName: \"kubernetes.io/projected/07ca8a28-80e6-4c48-9f34-f5f7567414e5-kube-api-access-nkv2b\") pod \"redhat-marketplace-zbhzj\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.936724 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-utilities\") pod \"redhat-marketplace-zbhzj\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:15 crc kubenswrapper[4922]: I0929 09:47:15.936789 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-catalog-content\") pod \"redhat-marketplace-zbhzj\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:15 crc kubenswrapper[4922]: E0929 09:47:15.936912 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.43689387 +0000 UTC m=+161.803124134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.038230 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkv2b\" (UniqueName: \"kubernetes.io/projected/07ca8a28-80e6-4c48-9f34-f5f7567414e5-kube-api-access-nkv2b\") pod \"redhat-marketplace-zbhzj\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.038703 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-utilities\") pod \"redhat-marketplace-zbhzj\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.038788 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.038856 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-catalog-content\") pod \"redhat-marketplace-zbhzj\" (UID: 
\"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.039417 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-catalog-content\") pod \"redhat-marketplace-zbhzj\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.039758 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-utilities\") pod \"redhat-marketplace-zbhzj\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.040183 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.540160724 +0000 UTC m=+161.906390988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.068707 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkv2b\" (UniqueName: \"kubernetes.io/projected/07ca8a28-80e6-4c48-9f34-f5f7567414e5-kube-api-access-nkv2b\") pod \"redhat-marketplace-zbhzj\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.140236 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.140451 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.640415769 +0000 UTC m=+162.006646033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.140700 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.141096 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.641080556 +0000 UTC m=+162.007310820 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.165854 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.185383 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" event={"ID":"a2f39a22-ce04-4af3-a76f-adbba71624b6","Type":"ContainerStarted","Data":"69ba75b31f24b9195c34a1ad9f3264ec5522c3db6dbd5be5c81a11d747e1dfc4"} Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.185460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" event={"ID":"a2f39a22-ce04-4af3-a76f-adbba71624b6","Type":"ContainerStarted","Data":"6cbe0ca3b2f8380e7a2688675365707c53a41d2aa3e8bb857d5dcde2d073cd71"} Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.189603 4922 generic.go:334] "Generic (PLEG): container finished" podID="f9235b49-72e8-47d7-8959-1950443e6175" containerID="282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac" exitCode=0 Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.189687 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z872" event={"ID":"f9235b49-72e8-47d7-8959-1950443e6175","Type":"ContainerDied","Data":"282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac"} Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.189715 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z872" event={"ID":"f9235b49-72e8-47d7-8959-1950443e6175","Type":"ContainerStarted","Data":"d107a585d2bed9117951a7011222132da8df2e0685f5935c418804d838b911f5"} Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.215202 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-6tlvg" podStartSLOduration=11.215178613 podStartE2EDuration="11.215178613s" podCreationTimestamp="2025-09-29 09:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:16.212189393 +0000 UTC m=+161.578419657" watchObservedRunningTime="2025-09-29 09:47:16.215178613 +0000 UTC m=+161.581408867" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.219935 4922 generic.go:334] "Generic (PLEG): container finished" podID="5969f093-753c-4213-8312-4a5c43cc6519" containerID="8e4cb69976d02373147859359220f3add96bd7a421ca0c602973b63697dca088" exitCode=0 Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.220090 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9j4" event={"ID":"5969f093-753c-4213-8312-4a5c43cc6519","Type":"ContainerDied","Data":"8e4cb69976d02373147859359220f3add96bd7a421ca0c602973b63697dca088"} Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.236345 4922 generic.go:334] "Generic (PLEG): container finished" podID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerID="bb9f34106216246a6fb2b73ead81b75f749979d0867a7eff7ef239d5575b1ce4" exitCode=0 Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.236493 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wvpd7" event={"ID":"cfcb9837-d910-452a-9c93-e842d5c6bcde","Type":"ContainerDied","Data":"bb9f34106216246a6fb2b73ead81b75f749979d0867a7eff7ef239d5575b1ce4"} Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.240324 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bcca377a-490a-4cbe-99be-418b2c1f42f1","Type":"ContainerStarted","Data":"8a60e512c683f1a4ca3f637da97bfcd3930f426f94d4a21df6ec99c1ebe14c0e"} Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.242357 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.242711 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.742694637 +0000 UTC m=+162.108924901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.338713 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8h8h5"] Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.349372 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.354798 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-29 09:47:16.854784217 +0000 UTC m=+162.221014481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: W0929 09:47:16.366908 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c6b228_c8fb_436e_bd7b_b2a0d78ae639.slice/crio-8b2e07e18e215989dde1d79a26f660682fbb333349adb1eea917dfa13561f422 WatchSource:0}: Error finding container 8b2e07e18e215989dde1d79a26f660682fbb333349adb1eea917dfa13561f422: Status 404 returned error can't find the container with id 8b2e07e18e215989dde1d79a26f660682fbb333349adb1eea917dfa13561f422 Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.427818 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbhzj"] Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.451856 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.452341 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-29 09:47:16.952261548 +0000 UTC m=+162.318491812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.452568 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.453316 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:16.953297525 +0000 UTC m=+162.319527789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.556329 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.556977 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:17.056886558 +0000 UTC m=+162.423116822 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.557045 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.557978 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 09:47:17.057967147 +0000 UTC m=+162.424197411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hnjbq" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.621751 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8q5p2"] Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.622959 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.625149 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.636629 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8q5p2"] Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.636994 4922 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-29T09:47:15.73486325Z","Handler":null,"Name":""} Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.659531 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:16 crc kubenswrapper[4922]: E0929 09:47:16.665447 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 09:47:17.165407874 +0000 UTC m=+162.531638148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.670787 4922 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.670823 4922 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.766924 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.766998 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-catalog-content\") pod \"redhat-operators-8q5p2\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.767027 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-utilities\") pod \"redhat-operators-8q5p2\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.767045 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js6nr\" (UniqueName: \"kubernetes.io/projected/c40832f2-23b9-4c87-8221-f5b790062ebd-kube-api-access-js6nr\") pod \"redhat-operators-8q5p2\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.772007 4922 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.772051 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.806316 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hnjbq\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.868135 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.868409 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-utilities\") pod \"redhat-operators-8q5p2\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.868439 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js6nr\" (UniqueName: \"kubernetes.io/projected/c40832f2-23b9-4c87-8221-f5b790062ebd-kube-api-access-js6nr\") pod \"redhat-operators-8q5p2\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.868596 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-catalog-content\") pod \"redhat-operators-8q5p2\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.869143 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-catalog-content\") pod \"redhat-operators-8q5p2\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.869275 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-utilities\") pod \"redhat-operators-8q5p2\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.877779 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.894909 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:47:16 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Sep 29 09:47:16 crc kubenswrapper[4922]: [+]process-running ok Sep 29 09:47:16 crc kubenswrapper[4922]: healthz check failed Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.894975 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.902814 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js6nr\" (UniqueName: \"kubernetes.io/projected/c40832f2-23b9-4c87-8221-f5b790062ebd-kube-api-access-js6nr\") pod \"redhat-operators-8q5p2\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.929014 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 29 09:47:16 crc kubenswrapper[4922]: I0929 09:47:16.980297 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.031907 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tfn4s"] Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.033372 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.034810 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tfn4s"] Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.174208 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-catalog-content\") pod \"redhat-operators-tfn4s\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.174305 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwvtq\" (UniqueName: \"kubernetes.io/projected/04a801a0-1679-482d-bff1-c894c40022af-kube-api-access-nwvtq\") pod \"redhat-operators-tfn4s\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.174457 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-utilities\") pod \"redhat-operators-tfn4s\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.263422 4922 generic.go:334] "Generic (PLEG): container finished" podID="bcca377a-490a-4cbe-99be-418b2c1f42f1" containerID="b950e45112cd4221f8b1697cc8a3b22573f32a8bf21199203189ee9d953713e7" exitCode=0 Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.263900 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bcca377a-490a-4cbe-99be-418b2c1f42f1","Type":"ContainerDied","Data":"b950e45112cd4221f8b1697cc8a3b22573f32a8bf21199203189ee9d953713e7"} Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.272454 4922 generic.go:334] "Generic (PLEG): container finished" podID="a24b1532-d6be-4a8e-a843-742f6328c431" containerID="48650bacf61792045d993e8c47659b1e73eb28f0e910022ea3053b405786a628" exitCode=0 Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.272506 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" event={"ID":"a24b1532-d6be-4a8e-a843-742f6328c431","Type":"ContainerDied","Data":"48650bacf61792045d993e8c47659b1e73eb28f0e910022ea3053b405786a628"} Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.275194 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-utilities\") pod \"redhat-operators-tfn4s\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.275235 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-catalog-content\") pod \"redhat-operators-tfn4s\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.275287 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwvtq\" (UniqueName: \"kubernetes.io/projected/04a801a0-1679-482d-bff1-c894c40022af-kube-api-access-nwvtq\") pod \"redhat-operators-tfn4s\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.277067 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-utilities\") pod \"redhat-operators-tfn4s\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.277116 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-catalog-content\") pod \"redhat-operators-tfn4s\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.280193 4922 generic.go:334] "Generic (PLEG): container finished" podID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerID="4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97" exitCode=0 Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.280259 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbhzj" event={"ID":"07ca8a28-80e6-4c48-9f34-f5f7567414e5","Type":"ContainerDied","Data":"4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97"} Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 
09:47:17.280290 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbhzj" event={"ID":"07ca8a28-80e6-4c48-9f34-f5f7567414e5","Type":"ContainerStarted","Data":"88016d125f3d327171e769ab789038a05cf9bdc965b088f4deb6ad5273c637c1"} Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.288974 4922 generic.go:334] "Generic (PLEG): container finished" podID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerID="fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c" exitCode=0 Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.289098 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8h8h5" event={"ID":"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639","Type":"ContainerDied","Data":"fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c"} Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.289132 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8h8h5" event={"ID":"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639","Type":"ContainerStarted","Data":"8b2e07e18e215989dde1d79a26f660682fbb333349adb1eea917dfa13561f422"} Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.297817 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwvtq\" (UniqueName: \"kubernetes.io/projected/04a801a0-1679-482d-bff1-c894c40022af-kube-api-access-nwvtq\") pod \"redhat-operators-tfn4s\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.379338 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.379378 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.385243 
4922 patch_prober.go:28] interesting pod/console-f9d7485db-4zgtm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.385322 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4zgtm" podUID="48e2c6f9-1502-4fa6-854d-ef25455dadb1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.388914 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8q5p2"] Sep 29 09:47:17 crc kubenswrapper[4922]: W0929 09:47:17.410225 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc40832f2_23b9_4c87_8221_f5b790062ebd.slice/crio-ee0f7317fe3010e30b6b6d8154c0941f894882f518c729a8801b5c780d0fb91b WatchSource:0}: Error finding container ee0f7317fe3010e30b6b6d8154c0941f894882f518c729a8801b5c780d0fb91b: Status 404 returned error can't find the container with id ee0f7317fe3010e30b6b6d8154c0941f894882f518c729a8801b5c780d0fb91b Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.432597 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.461540 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.512495 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hnjbq"] Sep 29 09:47:17 crc kubenswrapper[4922]: W0929 09:47:17.531539 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8af523a_fbfe_4e4a_9221_7f8a3a761ecc.slice/crio-d780246599051a76d7ccf7271eb47af0bf4dfcd1105b449e9921bcb0738730ab WatchSource:0}: Error finding container d780246599051a76d7ccf7271eb47af0bf4dfcd1105b449e9921bcb0738730ab: Status 404 returned error can't find the container with id d780246599051a76d7ccf7271eb47af0bf4dfcd1105b449e9921bcb0738730ab Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.755063 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tfn4s"] Sep 29 09:47:17 crc kubenswrapper[4922]: W0929 09:47:17.820002 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a801a0_1679_482d_bff1_c894c40022af.slice/crio-6b275d684053c66c0cec32f566c3947d00b0a33b55b2ebe27c18c50d6069e4e2 WatchSource:0}: Error finding container 6b275d684053c66c0cec32f566c3947d00b0a33b55b2ebe27c18c50d6069e4e2: Status 404 returned error can't find the container with id 6b275d684053c66c0cec32f566c3947d00b0a33b55b2ebe27c18c50d6069e4e2 Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.888917 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:47:17 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Sep 29 09:47:17 crc kubenswrapper[4922]: [+]process-running ok Sep 29 09:47:17 crc kubenswrapper[4922]: healthz check failed Sep 29 09:47:17 crc kubenswrapper[4922]: I0929 09:47:17.889022 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.105063 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ftc8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.105546 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8ftc8" podUID="70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.106013 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ftc8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.106078 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ftc8" podUID="70b9ec8e-e3e5-45b5-abf4-3d89fa8530aa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Sep 29 09:47:18 
crc kubenswrapper[4922]: I0929 09:47:18.215810 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pnjfh" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.298028 4922 generic.go:334] "Generic (PLEG): container finished" podID="04a801a0-1679-482d-bff1-c894c40022af" containerID="a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3" exitCode=0 Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.298092 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfn4s" event={"ID":"04a801a0-1679-482d-bff1-c894c40022af","Type":"ContainerDied","Data":"a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3"} Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.298119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfn4s" event={"ID":"04a801a0-1679-482d-bff1-c894c40022af","Type":"ContainerStarted","Data":"6b275d684053c66c0cec32f566c3947d00b0a33b55b2ebe27c18c50d6069e4e2"} Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.300773 4922 generic.go:334] "Generic (PLEG): container finished" podID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerID="4003a740d614bbc4447ab6a72936f94d7e5f380ee81e4131a6101ab30d37254f" exitCode=0 Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.300923 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5p2" event={"ID":"c40832f2-23b9-4c87-8221-f5b790062ebd","Type":"ContainerDied","Data":"4003a740d614bbc4447ab6a72936f94d7e5f380ee81e4131a6101ab30d37254f"} Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.300974 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5p2" event={"ID":"c40832f2-23b9-4c87-8221-f5b790062ebd","Type":"ContainerStarted","Data":"ee0f7317fe3010e30b6b6d8154c0941f894882f518c729a8801b5c780d0fb91b"} Sep 29 09:47:18 
crc kubenswrapper[4922]: I0929 09:47:18.312148 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" event={"ID":"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc","Type":"ContainerStarted","Data":"5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2"} Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.312199 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.312210 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" event={"ID":"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc","Type":"ContainerStarted","Data":"d780246599051a76d7ccf7271eb47af0bf4dfcd1105b449e9921bcb0738730ab"} Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.584499 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.584557 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.600307 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.629104 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" podStartSLOduration=142.629072187 podStartE2EDuration="2m22.629072187s" podCreationTimestamp="2025-09-29 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:18.376848148 +0000 UTC m=+163.743078412" watchObservedRunningTime="2025-09-29 09:47:18.629072187 +0000 UTC 
m=+163.995302451" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.649821 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.686648 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.830227 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a24b1532-d6be-4a8e-a843-742f6328c431-config-volume\") pod \"a24b1532-d6be-4a8e-a843-742f6328c431\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.830322 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl95g\" (UniqueName: \"kubernetes.io/projected/a24b1532-d6be-4a8e-a843-742f6328c431-kube-api-access-hl95g\") pod \"a24b1532-d6be-4a8e-a843-742f6328c431\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.830363 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcca377a-490a-4cbe-99be-418b2c1f42f1-kube-api-access\") pod \"bcca377a-490a-4cbe-99be-418b2c1f42f1\" (UID: \"bcca377a-490a-4cbe-99be-418b2c1f42f1\") " Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.830512 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcca377a-490a-4cbe-99be-418b2c1f42f1-kubelet-dir\") pod \"bcca377a-490a-4cbe-99be-418b2c1f42f1\" (UID: \"bcca377a-490a-4cbe-99be-418b2c1f42f1\") " Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.830540 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a24b1532-d6be-4a8e-a843-742f6328c431-secret-volume\") pod \"a24b1532-d6be-4a8e-a843-742f6328c431\" (UID: \"a24b1532-d6be-4a8e-a843-742f6328c431\") " Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.831435 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcca377a-490a-4cbe-99be-418b2c1f42f1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bcca377a-490a-4cbe-99be-418b2c1f42f1" (UID: "bcca377a-490a-4cbe-99be-418b2c1f42f1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.831727 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24b1532-d6be-4a8e-a843-742f6328c431-config-volume" (OuterVolumeSpecName: "config-volume") pod "a24b1532-d6be-4a8e-a843-742f6328c431" (UID: "a24b1532-d6be-4a8e-a843-742f6328c431"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.836187 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24b1532-d6be-4a8e-a843-742f6328c431-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a24b1532-d6be-4a8e-a843-742f6328c431" (UID: "a24b1532-d6be-4a8e-a843-742f6328c431"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.837860 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24b1532-d6be-4a8e-a843-742f6328c431-kube-api-access-hl95g" (OuterVolumeSpecName: "kube-api-access-hl95g") pod "a24b1532-d6be-4a8e-a843-742f6328c431" (UID: "a24b1532-d6be-4a8e-a843-742f6328c431"). InnerVolumeSpecName "kube-api-access-hl95g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.843187 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcca377a-490a-4cbe-99be-418b2c1f42f1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bcca377a-490a-4cbe-99be-418b2c1f42f1" (UID: "bcca377a-490a-4cbe-99be-418b2c1f42f1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.883743 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.887528 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:47:18 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Sep 29 09:47:18 crc kubenswrapper[4922]: [+]process-running ok Sep 29 09:47:18 crc kubenswrapper[4922]: healthz check failed Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.887628 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.932239 4922 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcca377a-490a-4cbe-99be-418b2c1f42f1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.932279 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a24b1532-d6be-4a8e-a843-742f6328c431-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.932287 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a24b1532-d6be-4a8e-a843-742f6328c431-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.932297 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl95g\" (UniqueName: \"kubernetes.io/projected/a24b1532-d6be-4a8e-a843-742f6328c431-kube-api-access-hl95g\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:18 crc kubenswrapper[4922]: I0929 09:47:18.932306 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcca377a-490a-4cbe-99be-418b2c1f42f1-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.325641 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.326448 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bcca377a-490a-4cbe-99be-418b2c1f42f1","Type":"ContainerDied","Data":"8a60e512c683f1a4ca3f637da97bfcd3930f426f94d4a21df6ec99c1ebe14c0e"} Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.326492 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a60e512c683f1a4ca3f637da97bfcd3930f426f94d4a21df6ec99c1ebe14c0e" Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.366733 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.371394 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn" event={"ID":"a24b1532-d6be-4a8e-a843-742f6328c431","Type":"ContainerDied","Data":"816f61faade8d6cd4a209cd8a5611daeedb0178eaa11cb52a4c1a67c74fd0e69"} Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.371546 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="816f61faade8d6cd4a209cd8a5611daeedb0178eaa11cb52a4c1a67c74fd0e69" Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.380877 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gg855" Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.644314 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.653225 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48a99f27-a7b4-466d-b130-026774744f7d-metrics-certs\") pod \"network-metrics-daemon-9p9s8\" (UID: \"48a99f27-a7b4-466d-b130-026774744f7d\") " pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.870617 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p9s8" Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.905957 4922 patch_prober.go:28] interesting pod/router-default-5444994796-fnbn6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 09:47:19 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Sep 29 09:47:19 crc kubenswrapper[4922]: [+]process-running ok Sep 29 09:47:19 crc kubenswrapper[4922]: healthz check failed Sep 29 09:47:19 crc kubenswrapper[4922]: I0929 09:47:19.906035 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fnbn6" podUID="272adf48-5f20-40e4-9bf0-563c630f9e12" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 09:47:20 crc kubenswrapper[4922]: I0929 09:47:20.204776 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9p9s8"] Sep 29 09:47:20 crc kubenswrapper[4922]: W0929 09:47:20.228350 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a99f27_a7b4_466d_b130_026774744f7d.slice/crio-c7aa52c1d0a174de42f1e783d85c6d30c67935e865745200777d96501fa0b964 WatchSource:0}: Error finding container c7aa52c1d0a174de42f1e783d85c6d30c67935e865745200777d96501fa0b964: Status 404 returned error can't find the container with id c7aa52c1d0a174de42f1e783d85c6d30c67935e865745200777d96501fa0b964 Sep 29 09:47:20 crc kubenswrapper[4922]: I0929 09:47:20.389055 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" event={"ID":"48a99f27-a7b4-466d-b130-026774744f7d","Type":"ContainerStarted","Data":"c7aa52c1d0a174de42f1e783d85c6d30c67935e865745200777d96501fa0b964"} Sep 29 09:47:20 crc kubenswrapper[4922]: I0929 
09:47:20.887791 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:20 crc kubenswrapper[4922]: I0929 09:47:20.892777 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-fnbn6" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.212451 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 09:47:21 crc kubenswrapper[4922]: E0929 09:47:21.213068 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24b1532-d6be-4a8e-a843-742f6328c431" containerName="collect-profiles" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.213087 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24b1532-d6be-4a8e-a843-742f6328c431" containerName="collect-profiles" Sep 29 09:47:21 crc kubenswrapper[4922]: E0929 09:47:21.213113 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcca377a-490a-4cbe-99be-418b2c1f42f1" containerName="pruner" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.213120 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcca377a-490a-4cbe-99be-418b2c1f42f1" containerName="pruner" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.213251 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcca377a-490a-4cbe-99be-418b2c1f42f1" containerName="pruner" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.213264 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24b1532-d6be-4a8e-a843-742f6328c431" containerName="collect-profiles" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.213891 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.216054 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.217031 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.217150 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.273111 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.273223 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.376553 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.376665 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.377268 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.397417 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.462365 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" event={"ID":"48a99f27-a7b4-466d-b130-026774744f7d","Type":"ContainerStarted","Data":"ee533f13b9b6c6550cd6b4517eb88d430be06da7f1458e6685780942d128b161"} Sep 29 09:47:21 crc kubenswrapper[4922]: I0929 09:47:21.567058 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:22 crc kubenswrapper[4922]: I0929 09:47:22.496571 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9p9s8" event={"ID":"48a99f27-a7b4-466d-b130-026774744f7d","Type":"ContainerStarted","Data":"7633fad64bddc92f267b88f6db01c7551157b0a764571daff9a8492b5706eee2"} Sep 29 09:47:22 crc kubenswrapper[4922]: I0929 09:47:22.516719 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9p9s8" podStartSLOduration=147.516701675 podStartE2EDuration="2m27.516701675s" podCreationTimestamp="2025-09-29 09:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:47:22.513564101 +0000 UTC m=+167.879794365" watchObservedRunningTime="2025-09-29 09:47:22.516701675 +0000 UTC m=+167.882931939" Sep 29 09:47:23 crc kubenswrapper[4922]: I0929 09:47:23.927255 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zk7zr" Sep 29 09:47:27 crc kubenswrapper[4922]: I0929 09:47:27.359229 4922 patch_prober.go:28] interesting pod/console-f9d7485db-4zgtm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Sep 29 09:47:27 crc kubenswrapper[4922]: I0929 09:47:27.359564 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4zgtm" podUID="48e2c6f9-1502-4fa6-854d-ef25455dadb1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Sep 29 09:47:28 crc kubenswrapper[4922]: I0929 09:47:28.111505 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/downloads-7954f5f757-8ftc8" Sep 29 09:47:29 crc kubenswrapper[4922]: I0929 09:47:29.070333 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:47:29 crc kubenswrapper[4922]: I0929 09:47:29.070498 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:47:36 crc kubenswrapper[4922]: I0929 09:47:36.890386 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:47:37 crc kubenswrapper[4922]: I0929 09:47:37.363327 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:37 crc kubenswrapper[4922]: I0929 09:47:37.367224 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:47:41 crc kubenswrapper[4922]: E0929 09:47:41.744432 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 29 09:47:41 crc kubenswrapper[4922]: E0929 09:47:41.746852 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bzcnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p5kxg_openshift-marketplace(0118e414-3687-49dc-acc6-454d86e13dfd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 09:47:41 crc kubenswrapper[4922]: E0929 09:47:41.748149 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p5kxg" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" 
Sep 29 09:47:41 crc kubenswrapper[4922]: E0929 09:47:41.812627 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 29 09:47:41 crc kubenswrapper[4922]: E0929 09:47:41.813265 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qx9dw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-wvpd7_openshift-marketplace(cfcb9837-d910-452a-9c93-e842d5c6bcde): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 29 09:47:41 crc kubenswrapper[4922]: E0929 09:47:41.814762 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wvpd7" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.236514 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 09:47:42 crc kubenswrapper[4922]: W0929 09:47:42.263165 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda1837cd0_2be3_46d2_b9b5_9d5936adc79a.slice/crio-8a272b22c3874a237d7643fb1310015a709cc0ab93bc615efe2a5d4ebc943513 WatchSource:0}: Error finding container 8a272b22c3874a237d7643fb1310015a709cc0ab93bc615efe2a5d4ebc943513: Status 404 returned error can't find the container with id 8a272b22c3874a237d7643fb1310015a709cc0ab93bc615efe2a5d4ebc943513 Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.636422 4922 generic.go:334] "Generic (PLEG): container finished" podID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerID="f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374" exitCode=0 Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.636839 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbhzj" event={"ID":"07ca8a28-80e6-4c48-9f34-f5f7567414e5","Type":"ContainerDied","Data":"f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374"} Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.640951 4922 generic.go:334] "Generic (PLEG): 
container finished" podID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerID="3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038" exitCode=0 Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.641058 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8h8h5" event={"ID":"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639","Type":"ContainerDied","Data":"3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038"} Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.644405 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5p2" event={"ID":"c40832f2-23b9-4c87-8221-f5b790062ebd","Type":"ContainerStarted","Data":"ef89d55718b65727659b41dbb7c1c3610ecc083347317ae994d0964b4a57c455"} Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.648132 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfn4s" event={"ID":"04a801a0-1679-482d-bff1-c894c40022af","Type":"ContainerStarted","Data":"f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49"} Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.650281 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1837cd0-2be3-46d2-b9b5-9d5936adc79a","Type":"ContainerStarted","Data":"8a272b22c3874a237d7643fb1310015a709cc0ab93bc615efe2a5d4ebc943513"} Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.658067 4922 generic.go:334] "Generic (PLEG): container finished" podID="f9235b49-72e8-47d7-8959-1950443e6175" containerID="f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0" exitCode=0 Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.658727 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z872" 
event={"ID":"f9235b49-72e8-47d7-8959-1950443e6175","Type":"ContainerDied","Data":"f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0"} Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.667587 4922 generic.go:334] "Generic (PLEG): container finished" podID="5969f093-753c-4213-8312-4a5c43cc6519" containerID="2375638b8081984de19afbeb199f29160995853d697950fff8cc7369c7adf3ff" exitCode=0 Sep 29 09:47:42 crc kubenswrapper[4922]: I0929 09:47:42.667713 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9j4" event={"ID":"5969f093-753c-4213-8312-4a5c43cc6519","Type":"ContainerDied","Data":"2375638b8081984de19afbeb199f29160995853d697950fff8cc7369c7adf3ff"} Sep 29 09:47:42 crc kubenswrapper[4922]: E0929 09:47:42.670218 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p5kxg" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" Sep 29 09:47:42 crc kubenswrapper[4922]: E0929 09:47:42.670286 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wvpd7" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.509442 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.679780 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbhzj" 
event={"ID":"07ca8a28-80e6-4c48-9f34-f5f7567414e5","Type":"ContainerStarted","Data":"dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4"} Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.685933 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8h8h5" event={"ID":"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639","Type":"ContainerStarted","Data":"c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5"} Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.688629 4922 generic.go:334] "Generic (PLEG): container finished" podID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerID="ef89d55718b65727659b41dbb7c1c3610ecc083347317ae994d0964b4a57c455" exitCode=0 Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.688694 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5p2" event={"ID":"c40832f2-23b9-4c87-8221-f5b790062ebd","Type":"ContainerDied","Data":"ef89d55718b65727659b41dbb7c1c3610ecc083347317ae994d0964b4a57c455"} Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.692697 4922 generic.go:334] "Generic (PLEG): container finished" podID="04a801a0-1679-482d-bff1-c894c40022af" containerID="f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49" exitCode=0 Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.692884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfn4s" event={"ID":"04a801a0-1679-482d-bff1-c894c40022af","Type":"ContainerDied","Data":"f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49"} Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.696812 4922 generic.go:334] "Generic (PLEG): container finished" podID="a1837cd0-2be3-46d2-b9b5-9d5936adc79a" containerID="96d02da8f5a62b45fc97ca075c8638470a646bb12de7268912912359fd7ec63d" exitCode=0 Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.696954 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1837cd0-2be3-46d2-b9b5-9d5936adc79a","Type":"ContainerDied","Data":"96d02da8f5a62b45fc97ca075c8638470a646bb12de7268912912359fd7ec63d"} Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.701015 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z872" event={"ID":"f9235b49-72e8-47d7-8959-1950443e6175","Type":"ContainerStarted","Data":"d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c"} Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.711284 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9j4" event={"ID":"5969f093-753c-4213-8312-4a5c43cc6519","Type":"ContainerStarted","Data":"9bfa2e3a9e9bb293868e27ddb81d19f902f5510f3c9f6ac875c61065f3c224bc"} Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.734054 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zbhzj" podStartSLOduration=2.737673538 podStartE2EDuration="28.734029334s" podCreationTimestamp="2025-09-29 09:47:15 +0000 UTC" firstStartedPulling="2025-09-29 09:47:17.309992378 +0000 UTC m=+162.676222642" lastFinishedPulling="2025-09-29 09:47:43.306348174 +0000 UTC m=+188.672578438" observedRunningTime="2025-09-29 09:47:43.706322055 +0000 UTC m=+189.072552339" watchObservedRunningTime="2025-09-29 09:47:43.734029334 +0000 UTC m=+189.100259598" Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.735823 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8h8h5" podStartSLOduration=2.763864226 podStartE2EDuration="28.735816261s" podCreationTimestamp="2025-09-29 09:47:15 +0000 UTC" firstStartedPulling="2025-09-29 09:47:17.309806224 +0000 UTC m=+162.676036488" lastFinishedPulling="2025-09-29 09:47:43.281758259 +0000 UTC m=+188.647988523" observedRunningTime="2025-09-29 09:47:43.733091809 
+0000 UTC m=+189.099322083" watchObservedRunningTime="2025-09-29 09:47:43.735816261 +0000 UTC m=+189.102046525" Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.762261 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8z872" podStartSLOduration=2.464238507 podStartE2EDuration="29.762238066s" podCreationTimestamp="2025-09-29 09:47:14 +0000 UTC" firstStartedPulling="2025-09-29 09:47:16.220532286 +0000 UTC m=+161.586762540" lastFinishedPulling="2025-09-29 09:47:43.518531835 +0000 UTC m=+188.884762099" observedRunningTime="2025-09-29 09:47:43.760651174 +0000 UTC m=+189.126881458" watchObservedRunningTime="2025-09-29 09:47:43.762238066 +0000 UTC m=+189.128468330" Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.851897 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ml9j4" podStartSLOduration=2.6885513960000003 podStartE2EDuration="30.851855207s" podCreationTimestamp="2025-09-29 09:47:13 +0000 UTC" firstStartedPulling="2025-09-29 09:47:15.200733942 +0000 UTC m=+160.566964206" lastFinishedPulling="2025-09-29 09:47:43.364037753 +0000 UTC m=+188.730268017" observedRunningTime="2025-09-29 09:47:43.849316519 +0000 UTC m=+189.215546803" watchObservedRunningTime="2025-09-29 09:47:43.851855207 +0000 UTC m=+189.218085471" Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.937693 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:43 crc kubenswrapper[4922]: I0929 09:47:43.937787 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:44 crc kubenswrapper[4922]: I0929 09:47:44.409685 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:44 crc kubenswrapper[4922]: 
I0929 09:47:44.410316 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:44 crc kubenswrapper[4922]: I0929 09:47:44.719028 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5p2" event={"ID":"c40832f2-23b9-4c87-8221-f5b790062ebd","Type":"ContainerStarted","Data":"0aa699a9f2c65703bca4acee6e87c3b91c01fa3af5d11ea787d67d1efb6486a4"} Sep 29 09:47:44 crc kubenswrapper[4922]: I0929 09:47:44.723046 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfn4s" event={"ID":"04a801a0-1679-482d-bff1-c894c40022af","Type":"ContainerStarted","Data":"3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520"} Sep 29 09:47:44 crc kubenswrapper[4922]: I0929 09:47:44.746339 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8q5p2" podStartSLOduration=2.941572274 podStartE2EDuration="28.746306968s" podCreationTimestamp="2025-09-29 09:47:16 +0000 UTC" firstStartedPulling="2025-09-29 09:47:18.309065681 +0000 UTC m=+163.675295945" lastFinishedPulling="2025-09-29 09:47:44.113800375 +0000 UTC m=+189.480030639" observedRunningTime="2025-09-29 09:47:44.739795824 +0000 UTC m=+190.106026088" watchObservedRunningTime="2025-09-29 09:47:44.746306968 +0000 UTC m=+190.112537232" Sep 29 09:47:44 crc kubenswrapper[4922]: I0929 09:47:44.758495 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tfn4s" podStartSLOduration=1.93921022 podStartE2EDuration="27.758467951s" podCreationTimestamp="2025-09-29 09:47:17 +0000 UTC" firstStartedPulling="2025-09-29 09:47:18.299484885 +0000 UTC m=+163.665715149" lastFinishedPulling="2025-09-29 09:47:44.118742616 +0000 UTC m=+189.484972880" observedRunningTime="2025-09-29 09:47:44.757536047 +0000 UTC m=+190.123766311" watchObservedRunningTime="2025-09-29 
09:47:44.758467951 +0000 UTC m=+190.124698215" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.117088 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.139107 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ml9j4" podUID="5969f093-753c-4213-8312-4a5c43cc6519" containerName="registry-server" probeResult="failure" output=< Sep 29 09:47:45 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Sep 29 09:47:45 crc kubenswrapper[4922]: > Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.283267 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kubelet-dir\") pod \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\" (UID: \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\") " Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.283443 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kube-api-access\") pod \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\" (UID: \"a1837cd0-2be3-46d2-b9b5-9d5936adc79a\") " Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.283462 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a1837cd0-2be3-46d2-b9b5-9d5936adc79a" (UID: "a1837cd0-2be3-46d2-b9b5-9d5936adc79a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.283696 4922 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.294623 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a1837cd0-2be3-46d2-b9b5-9d5936adc79a" (UID: "a1837cd0-2be3-46d2-b9b5-9d5936adc79a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.385020 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1837cd0-2be3-46d2-b9b5-9d5936adc79a-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.463292 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8z872" podUID="f9235b49-72e8-47d7-8959-1950443e6175" containerName="registry-server" probeResult="failure" output=< Sep 29 09:47:45 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Sep 29 09:47:45 crc kubenswrapper[4922]: > Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.730873 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.731818 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1837cd0-2be3-46d2-b9b5-9d5936adc79a","Type":"ContainerDied","Data":"8a272b22c3874a237d7643fb1310015a709cc0ab93bc615efe2a5d4ebc943513"} Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.731884 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a272b22c3874a237d7643fb1310015a709cc0ab93bc615efe2a5d4ebc943513" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.816155 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.816228 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:45 crc kubenswrapper[4922]: I0929 09:47:45.884251 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:46 crc kubenswrapper[4922]: I0929 09:47:46.167968 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:46 crc kubenswrapper[4922]: I0929 09:47:46.168017 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:46 crc kubenswrapper[4922]: I0929 09:47:46.211595 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:46 crc kubenswrapper[4922]: I0929 09:47:46.980989 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:46 crc kubenswrapper[4922]: I0929 09:47:46.981798 
4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:47 crc kubenswrapper[4922]: I0929 09:47:47.433128 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:47 crc kubenswrapper[4922]: I0929 09:47:47.433749 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:48 crc kubenswrapper[4922]: I0929 09:47:48.025995 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8q5p2" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerName="registry-server" probeResult="failure" output=< Sep 29 09:47:48 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Sep 29 09:47:48 crc kubenswrapper[4922]: > Sep 29 09:47:48 crc kubenswrapper[4922]: I0929 09:47:48.474454 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tfn4s" podUID="04a801a0-1679-482d-bff1-c894c40022af" containerName="registry-server" probeResult="failure" output=< Sep 29 09:47:48 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Sep 29 09:47:48 crc kubenswrapper[4922]: > Sep 29 09:47:48 crc kubenswrapper[4922]: I0929 09:47:48.898770 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l9gl6" Sep 29 09:47:53 crc kubenswrapper[4922]: I0929 09:47:53.993415 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:54 crc kubenswrapper[4922]: I0929 09:47:54.037348 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:47:54 crc kubenswrapper[4922]: I0929 09:47:54.457632 
4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:54 crc kubenswrapper[4922]: I0929 09:47:54.498591 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:55 crc kubenswrapper[4922]: I0929 09:47:55.229429 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8z872"] Sep 29 09:47:55 crc kubenswrapper[4922]: I0929 09:47:55.793268 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8z872" podUID="f9235b49-72e8-47d7-8959-1950443e6175" containerName="registry-server" containerID="cri-o://d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c" gracePeriod=2 Sep 29 09:47:55 crc kubenswrapper[4922]: I0929 09:47:55.868518 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.260295 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.308031 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.458453 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-utilities\") pod \"f9235b49-72e8-47d7-8959-1950443e6175\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.458551 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-catalog-content\") pod \"f9235b49-72e8-47d7-8959-1950443e6175\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.458733 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z88mp\" (UniqueName: \"kubernetes.io/projected/f9235b49-72e8-47d7-8959-1950443e6175-kube-api-access-z88mp\") pod \"f9235b49-72e8-47d7-8959-1950443e6175\" (UID: \"f9235b49-72e8-47d7-8959-1950443e6175\") " Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.460056 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-utilities" (OuterVolumeSpecName: "utilities") pod "f9235b49-72e8-47d7-8959-1950443e6175" (UID: "f9235b49-72e8-47d7-8959-1950443e6175"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.462044 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.470959 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9235b49-72e8-47d7-8959-1950443e6175-kube-api-access-z88mp" (OuterVolumeSpecName: "kube-api-access-z88mp") pod "f9235b49-72e8-47d7-8959-1950443e6175" (UID: "f9235b49-72e8-47d7-8959-1950443e6175"). InnerVolumeSpecName "kube-api-access-z88mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.513776 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9235b49-72e8-47d7-8959-1950443e6175" (UID: "f9235b49-72e8-47d7-8959-1950443e6175"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.563567 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9235b49-72e8-47d7-8959-1950443e6175-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.564131 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z88mp\" (UniqueName: \"kubernetes.io/projected/f9235b49-72e8-47d7-8959-1950443e6175-kube-api-access-z88mp\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.804027 4922 generic.go:334] "Generic (PLEG): container finished" podID="f9235b49-72e8-47d7-8959-1950443e6175" containerID="d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c" exitCode=0 Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.804087 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z872" event={"ID":"f9235b49-72e8-47d7-8959-1950443e6175","Type":"ContainerDied","Data":"d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c"} Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.804124 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z872" event={"ID":"f9235b49-72e8-47d7-8959-1950443e6175","Type":"ContainerDied","Data":"d107a585d2bed9117951a7011222132da8df2e0685f5935c418804d838b911f5"} Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.804150 4922 scope.go:117] "RemoveContainer" containerID="d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.804166 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8z872" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.835730 4922 scope.go:117] "RemoveContainer" containerID="f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.836736 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8z872"] Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.843185 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8z872"] Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.860184 4922 scope.go:117] "RemoveContainer" containerID="282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.878255 4922 scope.go:117] "RemoveContainer" containerID="d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c" Sep 29 09:47:56 crc kubenswrapper[4922]: E0929 09:47:56.879263 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c\": container with ID starting with d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c not found: ID does not exist" containerID="d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.879355 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c"} err="failed to get container status \"d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c\": rpc error: code = NotFound desc = could not find container \"d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c\": container with ID starting with d436cc7b99fb6060d68badbac2203836c3e679f92395a13553eec7b301a3160c not 
found: ID does not exist" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.879450 4922 scope.go:117] "RemoveContainer" containerID="f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0" Sep 29 09:47:56 crc kubenswrapper[4922]: E0929 09:47:56.879912 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0\": container with ID starting with f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0 not found: ID does not exist" containerID="f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.879949 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0"} err="failed to get container status \"f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0\": rpc error: code = NotFound desc = could not find container \"f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0\": container with ID starting with f6f5393feb64c43b7671e288988494a711abfaac33cfd57b813ff443e9460ac0 not found: ID does not exist" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.879968 4922 scope.go:117] "RemoveContainer" containerID="282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac" Sep 29 09:47:56 crc kubenswrapper[4922]: E0929 09:47:56.880651 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac\": container with ID starting with 282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac not found: ID does not exist" containerID="282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac" Sep 29 09:47:56 crc kubenswrapper[4922]: I0929 09:47:56.880682 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac"} err="failed to get container status \"282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac\": rpc error: code = NotFound desc = could not find container \"282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac\": container with ID starting with 282b2fef7ce79243262b0dca51cb23d9ba91141420d1d65550e12ec4147406ac not found: ID does not exist" Sep 29 09:47:57 crc kubenswrapper[4922]: I0929 09:47:57.037218 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:57 crc kubenswrapper[4922]: I0929 09:47:57.080823 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:47:57 crc kubenswrapper[4922]: I0929 09:47:57.459466 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9235b49-72e8-47d7-8959-1950443e6175" path="/var/lib/kubelet/pods/f9235b49-72e8-47d7-8959-1950443e6175/volumes" Sep 29 09:47:57 crc kubenswrapper[4922]: I0929 09:47:57.480810 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:57 crc kubenswrapper[4922]: I0929 09:47:57.531676 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:47:57 crc kubenswrapper[4922]: I0929 09:47:57.811871 4922 generic.go:334] "Generic (PLEG): container finished" podID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerID="699d1f9930082facbef08d6c0a26817978b8bbd8c1275f3a0d005851c1750251" exitCode=0 Sep 29 09:47:57 crc kubenswrapper[4922]: I0929 09:47:57.811949 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wvpd7" 
event={"ID":"cfcb9837-d910-452a-9c93-e842d5c6bcde","Type":"ContainerDied","Data":"699d1f9930082facbef08d6c0a26817978b8bbd8c1275f3a0d005851c1750251"} Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.227055 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbhzj"] Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.227305 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zbhzj" podUID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerName="registry-server" containerID="cri-o://dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4" gracePeriod=2 Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.704288 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.797652 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-catalog-content\") pod \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.797753 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-utilities\") pod \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.797795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkv2b\" (UniqueName: \"kubernetes.io/projected/07ca8a28-80e6-4c48-9f34-f5f7567414e5-kube-api-access-nkv2b\") pod \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\" (UID: \"07ca8a28-80e6-4c48-9f34-f5f7567414e5\") " Sep 29 09:47:58 crc kubenswrapper[4922]: 
I0929 09:47:58.800354 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-utilities" (OuterVolumeSpecName: "utilities") pod "07ca8a28-80e6-4c48-9f34-f5f7567414e5" (UID: "07ca8a28-80e6-4c48-9f34-f5f7567414e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.810073 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ca8a28-80e6-4c48-9f34-f5f7567414e5-kube-api-access-nkv2b" (OuterVolumeSpecName: "kube-api-access-nkv2b") pod "07ca8a28-80e6-4c48-9f34-f5f7567414e5" (UID: "07ca8a28-80e6-4c48-9f34-f5f7567414e5"). InnerVolumeSpecName "kube-api-access-nkv2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.818279 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07ca8a28-80e6-4c48-9f34-f5f7567414e5" (UID: "07ca8a28-80e6-4c48-9f34-f5f7567414e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.828029 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wvpd7" event={"ID":"cfcb9837-d910-452a-9c93-e842d5c6bcde","Type":"ContainerStarted","Data":"11c9e36225a05cf7639d038488d694edd540ff5b76489e8f92f6353108ec6b46"} Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.831236 4922 generic.go:334] "Generic (PLEG): container finished" podID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerID="dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4" exitCode=0 Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.831284 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbhzj" event={"ID":"07ca8a28-80e6-4c48-9f34-f5f7567414e5","Type":"ContainerDied","Data":"dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4"} Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.831316 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbhzj" event={"ID":"07ca8a28-80e6-4c48-9f34-f5f7567414e5","Type":"ContainerDied","Data":"88016d125f3d327171e769ab789038a05cf9bdc965b088f4deb6ad5273c637c1"} Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.831339 4922 scope.go:117] "RemoveContainer" containerID="dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.831471 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbhzj" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.852761 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wvpd7" podStartSLOduration=3.771640547 podStartE2EDuration="45.852723454s" podCreationTimestamp="2025-09-29 09:47:13 +0000 UTC" firstStartedPulling="2025-09-29 09:47:16.239277336 +0000 UTC m=+161.605507600" lastFinishedPulling="2025-09-29 09:47:58.320360243 +0000 UTC m=+203.686590507" observedRunningTime="2025-09-29 09:47:58.851154187 +0000 UTC m=+204.217384451" watchObservedRunningTime="2025-09-29 09:47:58.852723454 +0000 UTC m=+204.218953738" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.869334 4922 scope.go:117] "RemoveContainer" containerID="f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.877879 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbhzj"] Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.880964 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbhzj"] Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.885425 4922 scope.go:117] "RemoveContainer" containerID="4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.899516 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.899660 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07ca8a28-80e6-4c48-9f34-f5f7567414e5-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 
09:47:58.899675 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkv2b\" (UniqueName: \"kubernetes.io/projected/07ca8a28-80e6-4c48-9f34-f5f7567414e5-kube-api-access-nkv2b\") on node \"crc\" DevicePath \"\"" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.908766 4922 scope.go:117] "RemoveContainer" containerID="dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4" Sep 29 09:47:58 crc kubenswrapper[4922]: E0929 09:47:58.909516 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4\": container with ID starting with dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4 not found: ID does not exist" containerID="dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.909581 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4"} err="failed to get container status \"dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4\": rpc error: code = NotFound desc = could not find container \"dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4\": container with ID starting with dca9fbb4855bec2b5d55d99357b61836d6217310b2c0a7ed89546cc9065119a4 not found: ID does not exist" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.909620 4922 scope.go:117] "RemoveContainer" containerID="f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374" Sep 29 09:47:58 crc kubenswrapper[4922]: E0929 09:47:58.910112 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374\": container with ID starting with 
f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374 not found: ID does not exist" containerID="f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.910151 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374"} err="failed to get container status \"f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374\": rpc error: code = NotFound desc = could not find container \"f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374\": container with ID starting with f6f64905aad5f6a05dad63f6aab485a44827c6ed6a2d928ea3835e05e040b374 not found: ID does not exist" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.910168 4922 scope.go:117] "RemoveContainer" containerID="4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97" Sep 29 09:47:58 crc kubenswrapper[4922]: E0929 09:47:58.910489 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97\": container with ID starting with 4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97 not found: ID does not exist" containerID="4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97" Sep 29 09:47:58 crc kubenswrapper[4922]: I0929 09:47:58.910513 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97"} err="failed to get container status \"4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97\": rpc error: code = NotFound desc = could not find container \"4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97\": container with ID starting with 4b3e9a17b0b45dbc3bea3bb3f47dc88ef007fdd1a146347f24444762eaf09d97 not found: ID does not 
exist" Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.070458 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.070534 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.070590 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.071233 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.071300 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2" gracePeriod=600 Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.459114 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" 
path="/var/lib/kubelet/pods/07ca8a28-80e6-4c48-9f34-f5f7567414e5/volumes" Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.844575 4922 generic.go:334] "Generic (PLEG): container finished" podID="0118e414-3687-49dc-acc6-454d86e13dfd" containerID="26186d528300199298d91867e7b328523b9309270928cb06e27ac2a8a674f46f" exitCode=0 Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.844733 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5kxg" event={"ID":"0118e414-3687-49dc-acc6-454d86e13dfd","Type":"ContainerDied","Data":"26186d528300199298d91867e7b328523b9309270928cb06e27ac2a8a674f46f"} Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.849496 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2" exitCode=0 Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.849535 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2"} Sep 29 09:47:59 crc kubenswrapper[4922]: I0929 09:47:59.849561 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"cff6934a9169a4b4504fef15ca7f5cd9d69c634d61387892ad6e0193d51f4eb2"} Sep 29 09:48:00 crc kubenswrapper[4922]: I0929 09:48:00.830068 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tfn4s"] Sep 29 09:48:00 crc kubenswrapper[4922]: I0929 09:48:00.831158 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tfn4s" podUID="04a801a0-1679-482d-bff1-c894c40022af" 
containerName="registry-server" containerID="cri-o://3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520" gracePeriod=2 Sep 29 09:48:00 crc kubenswrapper[4922]: I0929 09:48:00.859990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5kxg" event={"ID":"0118e414-3687-49dc-acc6-454d86e13dfd","Type":"ContainerStarted","Data":"85e162ba94498eb1b18f21a3728de980c5fafd4dd4095525ee6d99d7c64c5ee8"} Sep 29 09:48:00 crc kubenswrapper[4922]: I0929 09:48:00.882007 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p5kxg" podStartSLOduration=2.427680811 podStartE2EDuration="47.881978247s" podCreationTimestamp="2025-09-29 09:47:13 +0000 UTC" firstStartedPulling="2025-09-29 09:47:15.123396469 +0000 UTC m=+160.489626733" lastFinishedPulling="2025-09-29 09:48:00.577693905 +0000 UTC m=+205.943924169" observedRunningTime="2025-09-29 09:48:00.877736353 +0000 UTC m=+206.243966617" watchObservedRunningTime="2025-09-29 09:48:00.881978247 +0000 UTC m=+206.248208511" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.284952 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.437380 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-catalog-content\") pod \"04a801a0-1679-482d-bff1-c894c40022af\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.437463 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwvtq\" (UniqueName: \"kubernetes.io/projected/04a801a0-1679-482d-bff1-c894c40022af-kube-api-access-nwvtq\") pod \"04a801a0-1679-482d-bff1-c894c40022af\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.437505 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-utilities\") pod \"04a801a0-1679-482d-bff1-c894c40022af\" (UID: \"04a801a0-1679-482d-bff1-c894c40022af\") " Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.438568 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-utilities" (OuterVolumeSpecName: "utilities") pod "04a801a0-1679-482d-bff1-c894c40022af" (UID: "04a801a0-1679-482d-bff1-c894c40022af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.449775 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a801a0-1679-482d-bff1-c894c40022af-kube-api-access-nwvtq" (OuterVolumeSpecName: "kube-api-access-nwvtq") pod "04a801a0-1679-482d-bff1-c894c40022af" (UID: "04a801a0-1679-482d-bff1-c894c40022af"). InnerVolumeSpecName "kube-api-access-nwvtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.539287 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.539350 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwvtq\" (UniqueName: \"kubernetes.io/projected/04a801a0-1679-482d-bff1-c894c40022af-kube-api-access-nwvtq\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.547318 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04a801a0-1679-482d-bff1-c894c40022af" (UID: "04a801a0-1679-482d-bff1-c894c40022af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.640629 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a801a0-1679-482d-bff1-c894c40022af-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.868347 4922 generic.go:334] "Generic (PLEG): container finished" podID="04a801a0-1679-482d-bff1-c894c40022af" containerID="3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520" exitCode=0 Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.868419 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfn4s" event={"ID":"04a801a0-1679-482d-bff1-c894c40022af","Type":"ContainerDied","Data":"3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520"} Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.868470 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-tfn4s" event={"ID":"04a801a0-1679-482d-bff1-c894c40022af","Type":"ContainerDied","Data":"6b275d684053c66c0cec32f566c3947d00b0a33b55b2ebe27c18c50d6069e4e2"} Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.868494 4922 scope.go:117] "RemoveContainer" containerID="3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.869594 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tfn4s" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.890513 4922 scope.go:117] "RemoveContainer" containerID="f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.910667 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tfn4s"] Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.915853 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tfn4s"] Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.928728 4922 scope.go:117] "RemoveContainer" containerID="a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.945210 4922 scope.go:117] "RemoveContainer" containerID="3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520" Sep 29 09:48:01 crc kubenswrapper[4922]: E0929 09:48:01.948385 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520\": container with ID starting with 3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520 not found: ID does not exist" containerID="3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.948445 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520"} err="failed to get container status \"3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520\": rpc error: code = NotFound desc = could not find container \"3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520\": container with ID starting with 3d81b2fa2b91fe1b6a707f00b9c09e98362844d730cbcb525481e3a91eab9520 not found: ID does not exist" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.948502 4922 scope.go:117] "RemoveContainer" containerID="f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49" Sep 29 09:48:01 crc kubenswrapper[4922]: E0929 09:48:01.948945 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49\": container with ID starting with f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49 not found: ID does not exist" containerID="f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.949011 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49"} err="failed to get container status \"f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49\": rpc error: code = NotFound desc = could not find container \"f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49\": container with ID starting with f9b6caa831cc23452760298819cdc18785154547ad36d601c17942fee3fa6a49 not found: ID does not exist" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.949038 4922 scope.go:117] "RemoveContainer" containerID="a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3" Sep 29 09:48:01 crc kubenswrapper[4922]: E0929 
09:48:01.949416 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3\": container with ID starting with a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3 not found: ID does not exist" containerID="a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3" Sep 29 09:48:01 crc kubenswrapper[4922]: I0929 09:48:01.949481 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3"} err="failed to get container status \"a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3\": rpc error: code = NotFound desc = could not find container \"a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3\": container with ID starting with a4053e500761f45cba8cf81161c5263bda24d32dc47a9b831207b00d59839ae3 not found: ID does not exist" Sep 29 09:48:03 crc kubenswrapper[4922]: I0929 09:48:03.461112 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a801a0-1679-482d-bff1-c894c40022af" path="/var/lib/kubelet/pods/04a801a0-1679-482d-bff1-c894c40022af/volumes" Sep 29 09:48:03 crc kubenswrapper[4922]: I0929 09:48:03.758616 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:48:03 crc kubenswrapper[4922]: I0929 09:48:03.758712 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:48:03 crc kubenswrapper[4922]: I0929 09:48:03.816155 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:48:04 crc kubenswrapper[4922]: I0929 09:48:04.215613 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:48:04 crc kubenswrapper[4922]: I0929 09:48:04.216944 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:48:04 crc kubenswrapper[4922]: I0929 09:48:04.258037 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:48:04 crc kubenswrapper[4922]: I0929 09:48:04.943035 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:48:07 crc kubenswrapper[4922]: I0929 09:48:07.642455 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wd7hc"] Sep 29 09:48:08 crc kubenswrapper[4922]: I0929 09:48:08.630846 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wvpd7"] Sep 29 09:48:08 crc kubenswrapper[4922]: I0929 09:48:08.631920 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wvpd7" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerName="registry-server" containerID="cri-o://11c9e36225a05cf7639d038488d694edd540ff5b76489e8f92f6353108ec6b46" gracePeriod=2 Sep 29 09:48:08 crc kubenswrapper[4922]: I0929 09:48:08.938583 4922 generic.go:334] "Generic (PLEG): container finished" podID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerID="11c9e36225a05cf7639d038488d694edd540ff5b76489e8f92f6353108ec6b46" exitCode=0 Sep 29 09:48:08 crc kubenswrapper[4922]: I0929 09:48:08.938721 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wvpd7" event={"ID":"cfcb9837-d910-452a-9c93-e842d5c6bcde","Type":"ContainerDied","Data":"11c9e36225a05cf7639d038488d694edd540ff5b76489e8f92f6353108ec6b46"} Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.024588 4922 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.154066 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx9dw\" (UniqueName: \"kubernetes.io/projected/cfcb9837-d910-452a-9c93-e842d5c6bcde-kube-api-access-qx9dw\") pod \"cfcb9837-d910-452a-9c93-e842d5c6bcde\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.154275 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-catalog-content\") pod \"cfcb9837-d910-452a-9c93-e842d5c6bcde\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.154321 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-utilities\") pod \"cfcb9837-d910-452a-9c93-e842d5c6bcde\" (UID: \"cfcb9837-d910-452a-9c93-e842d5c6bcde\") " Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.155987 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-utilities" (OuterVolumeSpecName: "utilities") pod "cfcb9837-d910-452a-9c93-e842d5c6bcde" (UID: "cfcb9837-d910-452a-9c93-e842d5c6bcde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.161007 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcb9837-d910-452a-9c93-e842d5c6bcde-kube-api-access-qx9dw" (OuterVolumeSpecName: "kube-api-access-qx9dw") pod "cfcb9837-d910-452a-9c93-e842d5c6bcde" (UID: "cfcb9837-d910-452a-9c93-e842d5c6bcde"). 
InnerVolumeSpecName "kube-api-access-qx9dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.208822 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfcb9837-d910-452a-9c93-e842d5c6bcde" (UID: "cfcb9837-d910-452a-9c93-e842d5c6bcde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.255601 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.255654 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcb9837-d910-452a-9c93-e842d5c6bcde-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.255664 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx9dw\" (UniqueName: \"kubernetes.io/projected/cfcb9837-d910-452a-9c93-e842d5c6bcde-kube-api-access-qx9dw\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.948305 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wvpd7" event={"ID":"cfcb9837-d910-452a-9c93-e842d5c6bcde","Type":"ContainerDied","Data":"7258d45f8185c24cfdabdb7664a5bc75b0bd8839ea41444aad546bc28497b582"} Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.948377 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wvpd7" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.948399 4922 scope.go:117] "RemoveContainer" containerID="11c9e36225a05cf7639d038488d694edd540ff5b76489e8f92f6353108ec6b46" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.972127 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wvpd7"] Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.973432 4922 scope.go:117] "RemoveContainer" containerID="699d1f9930082facbef08d6c0a26817978b8bbd8c1275f3a0d005851c1750251" Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.975470 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wvpd7"] Sep 29 09:48:09 crc kubenswrapper[4922]: I0929 09:48:09.992099 4922 scope.go:117] "RemoveContainer" containerID="bb9f34106216246a6fb2b73ead81b75f749979d0867a7eff7ef239d5575b1ce4" Sep 29 09:48:11 crc kubenswrapper[4922]: I0929 09:48:11.459037 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" path="/var/lib/kubelet/pods/cfcb9837-d910-452a-9c93-e842d5c6bcde/volumes" Sep 29 09:48:13 crc kubenswrapper[4922]: I0929 09:48:13.820189 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:48:32 crc kubenswrapper[4922]: I0929 09:48:32.673225 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" podUID="3174b863-8467-4dec-b1fd-602610f72a9f" containerName="oauth-openshift" containerID="cri-o://d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06" gracePeriod=15 Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.066783 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.102811 4922 generic.go:334] "Generic (PLEG): container finished" podID="3174b863-8467-4dec-b1fd-602610f72a9f" containerID="d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06" exitCode=0 Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.102969 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" event={"ID":"3174b863-8467-4dec-b1fd-602610f72a9f","Type":"ContainerDied","Data":"d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06"} Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.103028 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.103097 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wd7hc" event={"ID":"3174b863-8467-4dec-b1fd-602610f72a9f","Type":"ContainerDied","Data":"d9dacfe5b8f8f6ed6d7342c3caefec58f882255b7611ef1c8cda5a8f9ecb6623"} Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.103179 4922 scope.go:117] "RemoveContainer" containerID="d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113188 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7c4675448c-nsf9g"] Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113525 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerName="extract-utilities" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113553 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerName="extract-utilities" Sep 29 09:48:33 crc 
kubenswrapper[4922]: E0929 09:48:33.113568 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9235b49-72e8-47d7-8959-1950443e6175" containerName="extract-content" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113579 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9235b49-72e8-47d7-8959-1950443e6175" containerName="extract-content" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113596 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerName="extract-content" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113605 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerName="extract-content" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113624 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerName="extract-utilities" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113633 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerName="extract-utilities" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113646 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9235b49-72e8-47d7-8959-1950443e6175" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113654 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9235b49-72e8-47d7-8959-1950443e6175" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113664 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a801a0-1679-482d-bff1-c894c40022af" containerName="extract-utilities" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113673 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a801a0-1679-482d-bff1-c894c40022af" containerName="extract-utilities" Sep 29 09:48:33 crc 
kubenswrapper[4922]: E0929 09:48:33.113687 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1837cd0-2be3-46d2-b9b5-9d5936adc79a" containerName="pruner" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113699 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1837cd0-2be3-46d2-b9b5-9d5936adc79a" containerName="pruner" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113712 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a801a0-1679-482d-bff1-c894c40022af" containerName="extract-content" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113722 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a801a0-1679-482d-bff1-c894c40022af" containerName="extract-content" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113731 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3174b863-8467-4dec-b1fd-602610f72a9f" containerName="oauth-openshift" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113740 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3174b863-8467-4dec-b1fd-602610f72a9f" containerName="oauth-openshift" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113752 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9235b49-72e8-47d7-8959-1950443e6175" containerName="extract-utilities" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113776 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9235b49-72e8-47d7-8959-1950443e6175" containerName="extract-utilities" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113791 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a801a0-1679-482d-bff1-c894c40022af" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113800 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a801a0-1679-482d-bff1-c894c40022af" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 
09:48:33.113812 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113821 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113854 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113864 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.113875 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerName="extract-content" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.113883 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerName="extract-content" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.114013 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1837cd0-2be3-46d2-b9b5-9d5936adc79a" containerName="pruner" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.114033 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ca8a28-80e6-4c48-9f34-f5f7567414e5" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.114045 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfcb9837-d910-452a-9c93-e842d5c6bcde" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.114056 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9235b49-72e8-47d7-8959-1950443e6175" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 
09:48:33.114068 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3174b863-8467-4dec-b1fd-602610f72a9f" containerName="oauth-openshift" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.114080 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a801a0-1679-482d-bff1-c894c40022af" containerName="registry-server" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.114653 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.127639 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c4675448c-nsf9g"] Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.141581 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-provider-selection\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.141634 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3174b863-8467-4dec-b1fd-602610f72a9f-audit-dir\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.141683 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-ocp-branding-template\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.141757 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-trusted-ca-bundle\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.141792 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3174b863-8467-4dec-b1fd-602610f72a9f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143280 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143317 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-serving-cert\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143349 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-session\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143388 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-idp-0-file-data\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143422 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k75pc\" (UniqueName: \"kubernetes.io/projected/3174b863-8467-4dec-b1fd-602610f72a9f-kube-api-access-k75pc\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143445 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-audit-policies\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143471 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-login\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143519 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-cliconfig\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143540 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-service-ca\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143580 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-router-certs\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143600 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-error\") pod \"3174b863-8467-4dec-b1fd-602610f72a9f\" (UID: \"3174b863-8467-4dec-b1fd-602610f72a9f\") " Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143895 4922 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/3174b863-8467-4dec-b1fd-602610f72a9f-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.143927 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.144686 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.144740 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.148274 4922 scope.go:117] "RemoveContainer" containerID="d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.149349 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: E0929 09:48:33.149666 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06\": container with ID starting with d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06 not found: ID does not exist" containerID="d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.149803 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06"} err="failed to get container status \"d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06\": rpc error: code = NotFound desc = could not find container \"d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06\": container with ID starting with d85b73a5c1d35b7fde6ab7588af5f6b4d6d3dea9a2a43f634ee7c1bef0e93e06 not found: ID does not exist" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.149893 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.150713 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). 
InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.151238 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.153065 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.163049 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.163490 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.163666 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.163966 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3174b863-8467-4dec-b1fd-602610f72a9f-kube-api-access-k75pc" (OuterVolumeSpecName: "kube-api-access-k75pc") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "kube-api-access-k75pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.164248 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3174b863-8467-4dec-b1fd-602610f72a9f" (UID: "3174b863-8467-4dec-b1fd-602610f72a9f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.245631 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.245720 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.245770 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.245812 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-session\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.245877 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq7qk\" (UniqueName: \"kubernetes.io/projected/9696c530-b67f-4abb-bf5e-2ea907a80baa-kube-api-access-wq7qk\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.245915 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.245968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246253 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246316 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-template-login\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246378 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246447 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9696c530-b67f-4abb-bf5e-2ea907a80baa-audit-dir\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246469 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-template-error\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246612 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: 
\"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246653 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-audit-policies\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246764 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246776 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246795 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246809 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246822 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246855 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246871 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246884 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246897 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246907 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k75pc\" (UniqueName: \"kubernetes.io/projected/3174b863-8467-4dec-b1fd-602610f72a9f-kube-api-access-k75pc\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246919 4922 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3174b863-8467-4dec-b1fd-602610f72a9f-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.246935 4922 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3174b863-8467-4dec-b1fd-602610f72a9f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.348726 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-audit-policies\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.348859 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.348914 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.348971 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: 
I0929 09:48:33.349010 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-session\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.349048 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq7qk\" (UniqueName: \"kubernetes.io/projected/9696c530-b67f-4abb-bf5e-2ea907a80baa-kube-api-access-wq7qk\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.349091 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.349140 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.349192 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.349231 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-template-login\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.349331 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.349381 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9696c530-b67f-4abb-bf5e-2ea907a80baa-audit-dir\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.349418 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-template-error\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc 
kubenswrapper[4922]: I0929 09:48:33.349472 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.349992 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9696c530-b67f-4abb-bf5e-2ea907a80baa-audit-dir\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.351098 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.351154 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-audit-policies\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.351269 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c4675448c-nsf9g\" 
(UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.351479 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.353602 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.353741 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-template-login\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.354172 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-template-error\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.355078 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.355112 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.355891 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-session\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.356217 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.357245 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9696c530-b67f-4abb-bf5e-2ea907a80baa-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: 
\"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.371159 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq7qk\" (UniqueName: \"kubernetes.io/projected/9696c530-b67f-4abb-bf5e-2ea907a80baa-kube-api-access-wq7qk\") pod \"oauth-openshift-7c4675448c-nsf9g\" (UID: \"9696c530-b67f-4abb-bf5e-2ea907a80baa\") " pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.446142 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wd7hc"] Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.448413 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.465866 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wd7hc"] Sep 29 09:48:33 crc kubenswrapper[4922]: I0929 09:48:33.704403 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c4675448c-nsf9g"] Sep 29 09:48:34 crc kubenswrapper[4922]: I0929 09:48:34.110303 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" event={"ID":"9696c530-b67f-4abb-bf5e-2ea907a80baa","Type":"ContainerStarted","Data":"a013781b1b22eb38c2d2479ca4c20bba48ed77224bfe5fca8494deec8079c3de"} Sep 29 09:48:34 crc kubenswrapper[4922]: I0929 09:48:34.110866 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:34 crc kubenswrapper[4922]: I0929 09:48:34.110927 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" 
event={"ID":"9696c530-b67f-4abb-bf5e-2ea907a80baa","Type":"ContainerStarted","Data":"a943a82c44c2eba5a8479d2fdedbb4b1c96f139d4cbb9d7425d01bc63dc758e5"} Sep 29 09:48:34 crc kubenswrapper[4922]: I0929 09:48:34.135262 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" podStartSLOduration=27.135229821 podStartE2EDuration="27.135229821s" podCreationTimestamp="2025-09-29 09:48:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:48:34.132129579 +0000 UTC m=+239.498359843" watchObservedRunningTime="2025-09-29 09:48:34.135229821 +0000 UTC m=+239.501460085" Sep 29 09:48:34 crc kubenswrapper[4922]: I0929 09:48:34.573541 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7c4675448c-nsf9g" Sep 29 09:48:35 crc kubenswrapper[4922]: I0929 09:48:35.465072 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3174b863-8467-4dec-b1fd-602610f72a9f" path="/var/lib/kubelet/pods/3174b863-8467-4dec-b1fd-602610f72a9f/volumes" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.094255 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5kxg"] Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.096444 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p5kxg" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" containerName="registry-server" containerID="cri-o://85e162ba94498eb1b18f21a3728de980c5fafd4dd4095525ee6d99d7c64c5ee8" gracePeriod=30 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.112893 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ml9j4"] Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.113720 4922 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ml9j4" podUID="5969f093-753c-4213-8312-4a5c43cc6519" containerName="registry-server" containerID="cri-o://9bfa2e3a9e9bb293868e27ddb81d19f902f5510f3c9f6ac875c61065f3c224bc" gracePeriod=30 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.129798 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5mr24"] Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.130265 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" podUID="081e1e41-0f63-463a-b699-4c680f61122b" containerName="marketplace-operator" containerID="cri-o://f9b7d1341896d7c996f55848f24200138494a7ca3df906655bc946484034bce4" gracePeriod=30 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.158015 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8h8h5"] Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.162153 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8q5p2"] Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.162685 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8q5p2" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerName="registry-server" containerID="cri-o://0aa699a9f2c65703bca4acee6e87c3b91c01fa3af5d11ea787d67d1efb6486a4" gracePeriod=30 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.171965 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fkw4t"] Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.172764 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.175872 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fkw4t"] Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.354547 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkrpw\" (UniqueName: \"kubernetes.io/projected/69498aa6-9b16-42bd-97f7-f3f52b763788-kube-api-access-nkrpw\") pod \"marketplace-operator-79b997595-fkw4t\" (UID: \"69498aa6-9b16-42bd-97f7-f3f52b763788\") " pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.355417 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69498aa6-9b16-42bd-97f7-f3f52b763788-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fkw4t\" (UID: \"69498aa6-9b16-42bd-97f7-f3f52b763788\") " pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.355572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/69498aa6-9b16-42bd-97f7-f3f52b763788-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fkw4t\" (UID: \"69498aa6-9b16-42bd-97f7-f3f52b763788\") " pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.361552 4922 generic.go:334] "Generic (PLEG): container finished" podID="081e1e41-0f63-463a-b699-4c680f61122b" containerID="f9b7d1341896d7c996f55848f24200138494a7ca3df906655bc946484034bce4" exitCode=0 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.361629 4922 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" event={"ID":"081e1e41-0f63-463a-b699-4c680f61122b","Type":"ContainerDied","Data":"f9b7d1341896d7c996f55848f24200138494a7ca3df906655bc946484034bce4"} Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.363910 4922 generic.go:334] "Generic (PLEG): container finished" podID="5969f093-753c-4213-8312-4a5c43cc6519" containerID="9bfa2e3a9e9bb293868e27ddb81d19f902f5510f3c9f6ac875c61065f3c224bc" exitCode=0 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.363991 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9j4" event={"ID":"5969f093-753c-4213-8312-4a5c43cc6519","Type":"ContainerDied","Data":"9bfa2e3a9e9bb293868e27ddb81d19f902f5510f3c9f6ac875c61065f3c224bc"} Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.366537 4922 generic.go:334] "Generic (PLEG): container finished" podID="0118e414-3687-49dc-acc6-454d86e13dfd" containerID="85e162ba94498eb1b18f21a3728de980c5fafd4dd4095525ee6d99d7c64c5ee8" exitCode=0 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.366586 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5kxg" event={"ID":"0118e414-3687-49dc-acc6-454d86e13dfd","Type":"ContainerDied","Data":"85e162ba94498eb1b18f21a3728de980c5fafd4dd4095525ee6d99d7c64c5ee8"} Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.369991 4922 generic.go:334] "Generic (PLEG): container finished" podID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerID="0aa699a9f2c65703bca4acee6e87c3b91c01fa3af5d11ea787d67d1efb6486a4" exitCode=0 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.370192 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8h8h5" podUID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerName="registry-server" containerID="cri-o://c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5" 
gracePeriod=30 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.370489 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5p2" event={"ID":"c40832f2-23b9-4c87-8221-f5b790062ebd","Type":"ContainerDied","Data":"0aa699a9f2c65703bca4acee6e87c3b91c01fa3af5d11ea787d67d1efb6486a4"} Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.456350 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/69498aa6-9b16-42bd-97f7-f3f52b763788-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fkw4t\" (UID: \"69498aa6-9b16-42bd-97f7-f3f52b763788\") " pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.457410 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkrpw\" (UniqueName: \"kubernetes.io/projected/69498aa6-9b16-42bd-97f7-f3f52b763788-kube-api-access-nkrpw\") pod \"marketplace-operator-79b997595-fkw4t\" (UID: \"69498aa6-9b16-42bd-97f7-f3f52b763788\") " pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.457437 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69498aa6-9b16-42bd-97f7-f3f52b763788-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fkw4t\" (UID: \"69498aa6-9b16-42bd-97f7-f3f52b763788\") " pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.460167 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69498aa6-9b16-42bd-97f7-f3f52b763788-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fkw4t\" (UID: \"69498aa6-9b16-42bd-97f7-f3f52b763788\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.466943 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/69498aa6-9b16-42bd-97f7-f3f52b763788-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fkw4t\" (UID: \"69498aa6-9b16-42bd-97f7-f3f52b763788\") " pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.480544 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkrpw\" (UniqueName: \"kubernetes.io/projected/69498aa6-9b16-42bd-97f7-f3f52b763788-kube-api-access-nkrpw\") pod \"marketplace-operator-79b997595-fkw4t\" (UID: \"69498aa6-9b16-42bd-97f7-f3f52b763788\") " pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.571514 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.576188 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.580962 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.586045 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.645182 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760550 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-trusted-ca\") pod \"081e1e41-0f63-463a-b699-4c680f61122b\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760620 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctzdc\" (UniqueName: \"kubernetes.io/projected/081e1e41-0f63-463a-b699-4c680f61122b-kube-api-access-ctzdc\") pod \"081e1e41-0f63-463a-b699-4c680f61122b\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760651 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-operator-metrics\") pod \"081e1e41-0f63-463a-b699-4c680f61122b\" (UID: \"081e1e41-0f63-463a-b699-4c680f61122b\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760673 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-catalog-content\") pod \"0118e414-3687-49dc-acc6-454d86e13dfd\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760711 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-utilities\") pod \"5969f093-753c-4213-8312-4a5c43cc6519\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760734 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-utilities\") pod \"c40832f2-23b9-4c87-8221-f5b790062ebd\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760752 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzcnl\" (UniqueName: \"kubernetes.io/projected/0118e414-3687-49dc-acc6-454d86e13dfd-kube-api-access-bzcnl\") pod \"0118e414-3687-49dc-acc6-454d86e13dfd\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760779 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-catalog-content\") pod \"5969f093-753c-4213-8312-4a5c43cc6519\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760811 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcwl8\" (UniqueName: \"kubernetes.io/projected/5969f093-753c-4213-8312-4a5c43cc6519-kube-api-access-qcwl8\") pod \"5969f093-753c-4213-8312-4a5c43cc6519\" (UID: \"5969f093-753c-4213-8312-4a5c43cc6519\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760872 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-utilities\") pod \"0118e414-3687-49dc-acc6-454d86e13dfd\" (UID: \"0118e414-3687-49dc-acc6-454d86e13dfd\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760894 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js6nr\" (UniqueName: \"kubernetes.io/projected/c40832f2-23b9-4c87-8221-f5b790062ebd-kube-api-access-js6nr\") pod 
\"c40832f2-23b9-4c87-8221-f5b790062ebd\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.760913 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-catalog-content\") pod \"c40832f2-23b9-4c87-8221-f5b790062ebd\" (UID: \"c40832f2-23b9-4c87-8221-f5b790062ebd\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.761817 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "081e1e41-0f63-463a-b699-4c680f61122b" (UID: "081e1e41-0f63-463a-b699-4c680f61122b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.762026 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-utilities" (OuterVolumeSpecName: "utilities") pod "c40832f2-23b9-4c87-8221-f5b790062ebd" (UID: "c40832f2-23b9-4c87-8221-f5b790062ebd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.762241 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-utilities" (OuterVolumeSpecName: "utilities") pod "0118e414-3687-49dc-acc6-454d86e13dfd" (UID: "0118e414-3687-49dc-acc6-454d86e13dfd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.762809 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-utilities" (OuterVolumeSpecName: "utilities") pod "5969f093-753c-4213-8312-4a5c43cc6519" (UID: "5969f093-753c-4213-8312-4a5c43cc6519"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.764541 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081e1e41-0f63-463a-b699-4c680f61122b-kube-api-access-ctzdc" (OuterVolumeSpecName: "kube-api-access-ctzdc") pod "081e1e41-0f63-463a-b699-4c680f61122b" (UID: "081e1e41-0f63-463a-b699-4c680f61122b"). InnerVolumeSpecName "kube-api-access-ctzdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.765795 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40832f2-23b9-4c87-8221-f5b790062ebd-kube-api-access-js6nr" (OuterVolumeSpecName: "kube-api-access-js6nr") pod "c40832f2-23b9-4c87-8221-f5b790062ebd" (UID: "c40832f2-23b9-4c87-8221-f5b790062ebd"). InnerVolumeSpecName "kube-api-access-js6nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.769139 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5969f093-753c-4213-8312-4a5c43cc6519-kube-api-access-qcwl8" (OuterVolumeSpecName: "kube-api-access-qcwl8") pod "5969f093-753c-4213-8312-4a5c43cc6519" (UID: "5969f093-753c-4213-8312-4a5c43cc6519"). InnerVolumeSpecName "kube-api-access-qcwl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.769190 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "081e1e41-0f63-463a-b699-4c680f61122b" (UID: "081e1e41-0f63-463a-b699-4c680f61122b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.772005 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0118e414-3687-49dc-acc6-454d86e13dfd-kube-api-access-bzcnl" (OuterVolumeSpecName: "kube-api-access-bzcnl") pod "0118e414-3687-49dc-acc6-454d86e13dfd" (UID: "0118e414-3687-49dc-acc6-454d86e13dfd"). InnerVolumeSpecName "kube-api-access-bzcnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.772635 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.816881 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0118e414-3687-49dc-acc6-454d86e13dfd" (UID: "0118e414-3687-49dc-acc6-454d86e13dfd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.817965 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5969f093-753c-4213-8312-4a5c43cc6519" (UID: "5969f093-753c-4213-8312-4a5c43cc6519"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.862987 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863020 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctzdc\" (UniqueName: \"kubernetes.io/projected/081e1e41-0f63-463a-b699-4c680f61122b-kube-api-access-ctzdc\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863032 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863040 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/081e1e41-0f63-463a-b699-4c680f61122b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863052 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863063 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863073 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzcnl\" (UniqueName: \"kubernetes.io/projected/0118e414-3687-49dc-acc6-454d86e13dfd-kube-api-access-bzcnl\") on node 
\"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863082 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5969f093-753c-4213-8312-4a5c43cc6519-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863091 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcwl8\" (UniqueName: \"kubernetes.io/projected/5969f093-753c-4213-8312-4a5c43cc6519-kube-api-access-qcwl8\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863099 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0118e414-3687-49dc-acc6-454d86e13dfd-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.863110 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js6nr\" (UniqueName: \"kubernetes.io/projected/c40832f2-23b9-4c87-8221-f5b790062ebd-kube-api-access-js6nr\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.877778 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c40832f2-23b9-4c87-8221-f5b790062ebd" (UID: "c40832f2-23b9-4c87-8221-f5b790062ebd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.884474 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fkw4t"] Sep 29 09:49:05 crc kubenswrapper[4922]: W0929 09:49:05.894441 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69498aa6_9b16_42bd_97f7_f3f52b763788.slice/crio-ea960baeecb5989d20bb374afbaada6c059d3efa0316c5b5e8a24f2da12cf992 WatchSource:0}: Error finding container ea960baeecb5989d20bb374afbaada6c059d3efa0316c5b5e8a24f2da12cf992: Status 404 returned error can't find the container with id ea960baeecb5989d20bb374afbaada6c059d3efa0316c5b5e8a24f2da12cf992 Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.963810 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngcp4\" (UniqueName: \"kubernetes.io/projected/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-kube-api-access-ngcp4\") pod \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.964387 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-catalog-content\") pod \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.964420 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-utilities\") pod \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\" (UID: \"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639\") " Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.964558 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/c40832f2-23b9-4c87-8221-f5b790062ebd-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.965203 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-utilities" (OuterVolumeSpecName: "utilities") pod "d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" (UID: "d9c6b228-c8fb-436e-bd7b-b2a0d78ae639"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.967333 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-kube-api-access-ngcp4" (OuterVolumeSpecName: "kube-api-access-ngcp4") pod "d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" (UID: "d9c6b228-c8fb-436e-bd7b-b2a0d78ae639"). InnerVolumeSpecName "kube-api-access-ngcp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:49:05 crc kubenswrapper[4922]: I0929 09:49:05.981430 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" (UID: "d9c6b228-c8fb-436e-bd7b-b2a0d78ae639"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.065899 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.065968 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.065986 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngcp4\" (UniqueName: \"kubernetes.io/projected/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639-kube-api-access-ngcp4\") on node \"crc\" DevicePath \"\"" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.378782 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" event={"ID":"69498aa6-9b16-42bd-97f7-f3f52b763788","Type":"ContainerStarted","Data":"12b45c32148a9283d2d732f4c0a9d61793af9a421377fe718bcf348669efe521"} Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.378862 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" event={"ID":"69498aa6-9b16-42bd-97f7-f3f52b763788","Type":"ContainerStarted","Data":"ea960baeecb5989d20bb374afbaada6c059d3efa0316c5b5e8a24f2da12cf992"} Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.378891 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.381087 4922 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fkw4t container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.381235 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" podUID="69498aa6-9b16-42bd-97f7-f3f52b763788" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.382909 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5kxg" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.382884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5kxg" event={"ID":"0118e414-3687-49dc-acc6-454d86e13dfd","Type":"ContainerDied","Data":"5142f1f7c0e0cb0c33aa0cf2ba63925f967dd80170aaf476921dbbfb04ac6860"} Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.383111 4922 scope.go:117] "RemoveContainer" containerID="85e162ba94498eb1b18f21a3728de980c5fafd4dd4095525ee6d99d7c64c5ee8" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.399527 4922 generic.go:334] "Generic (PLEG): container finished" podID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerID="c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5" exitCode=0 Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.399773 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8h8h5" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.399795 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8h8h5" event={"ID":"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639","Type":"ContainerDied","Data":"c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5"} Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.399892 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8h8h5" event={"ID":"d9c6b228-c8fb-436e-bd7b-b2a0d78ae639","Type":"ContainerDied","Data":"8b2e07e18e215989dde1d79a26f660682fbb333349adb1eea917dfa13561f422"} Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.407008 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5p2" event={"ID":"c40832f2-23b9-4c87-8221-f5b790062ebd","Type":"ContainerDied","Data":"ee0f7317fe3010e30b6b6d8154c0941f894882f518c729a8801b5c780d0fb91b"} Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.407056 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8q5p2" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.410361 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.410434 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5mr24" event={"ID":"081e1e41-0f63-463a-b699-4c680f61122b","Type":"ContainerDied","Data":"937bfc6c9ff483eead76c099eff3fd59cf0a02ab2c2a5493f9604a1c8941e219"} Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.416774 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" podStartSLOduration=1.41674346 podStartE2EDuration="1.41674346s" podCreationTimestamp="2025-09-29 09:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:49:06.41073099 +0000 UTC m=+271.776961254" watchObservedRunningTime="2025-09-29 09:49:06.41674346 +0000 UTC m=+271.782973724" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.419944 4922 scope.go:117] "RemoveContainer" containerID="26186d528300199298d91867e7b328523b9309270928cb06e27ac2a8a674f46f" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.421578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9j4" event={"ID":"5969f093-753c-4213-8312-4a5c43cc6519","Type":"ContainerDied","Data":"ccb03c15f0082352e494b4dff7f1e16686da8c67e655dca1f9f3c0ea82bb6cbd"} Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.421659 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ml9j4" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.439855 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5kxg"] Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.446405 4922 scope.go:117] "RemoveContainer" containerID="6ff52c69036020433aec88b6f7672a4b38af1f4938cf73782cea2fc7baf31013" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.454807 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p5kxg"] Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.472307 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8h8h5"] Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.475297 4922 scope.go:117] "RemoveContainer" containerID="c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.478474 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8h8h5"] Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.494062 4922 scope.go:117] "RemoveContainer" containerID="3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.501698 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ml9j4"] Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.506594 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ml9j4"] Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.518384 4922 scope.go:117] "RemoveContainer" containerID="fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.541719 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-8q5p2"] Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.544715 4922 scope.go:117] "RemoveContainer" containerID="c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.545679 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8q5p2"] Sep 29 09:49:06 crc kubenswrapper[4922]: E0929 09:49:06.546005 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5\": container with ID starting with c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5 not found: ID does not exist" containerID="c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.546054 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5"} err="failed to get container status \"c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5\": rpc error: code = NotFound desc = could not find container \"c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5\": container with ID starting with c8939db0c55f86327085e2e2038cdc2e9295bd18b5be9ccc04295b4dae4c25b5 not found: ID does not exist" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.546089 4922 scope.go:117] "RemoveContainer" containerID="3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038" Sep 29 09:49:06 crc kubenswrapper[4922]: E0929 09:49:06.547249 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038\": container with ID starting with 3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038 not 
found: ID does not exist" containerID="3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.547293 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038"} err="failed to get container status \"3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038\": rpc error: code = NotFound desc = could not find container \"3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038\": container with ID starting with 3e584dcd69154f8018aa40384bdd542cbd38aaaaa9d70424c61237abf44f1038 not found: ID does not exist" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.547323 4922 scope.go:117] "RemoveContainer" containerID="fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c" Sep 29 09:49:06 crc kubenswrapper[4922]: E0929 09:49:06.550010 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c\": container with ID starting with fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c not found: ID does not exist" containerID="fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.550050 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c"} err="failed to get container status \"fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c\": rpc error: code = NotFound desc = could not find container \"fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c\": container with ID starting with fd4c8fbf8fc5e684159e4c1686a105d29b18ca43be14859f28a5cc2dd52baa4c not found: ID does not exist" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.550069 
4922 scope.go:117] "RemoveContainer" containerID="0aa699a9f2c65703bca4acee6e87c3b91c01fa3af5d11ea787d67d1efb6486a4" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.564437 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5mr24"] Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.565907 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5mr24"] Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.571075 4922 scope.go:117] "RemoveContainer" containerID="ef89d55718b65727659b41dbb7c1c3610ecc083347317ae994d0964b4a57c455" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.587940 4922 scope.go:117] "RemoveContainer" containerID="4003a740d614bbc4447ab6a72936f94d7e5f380ee81e4131a6101ab30d37254f" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.605196 4922 scope.go:117] "RemoveContainer" containerID="f9b7d1341896d7c996f55848f24200138494a7ca3df906655bc946484034bce4" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.618717 4922 scope.go:117] "RemoveContainer" containerID="9bfa2e3a9e9bb293868e27ddb81d19f902f5510f3c9f6ac875c61065f3c224bc" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.632408 4922 scope.go:117] "RemoveContainer" containerID="2375638b8081984de19afbeb199f29160995853d697950fff8cc7369c7adf3ff" Sep 29 09:49:06 crc kubenswrapper[4922]: I0929 09:49:06.648889 4922 scope.go:117] "RemoveContainer" containerID="8e4cb69976d02373147859359220f3add96bd7a421ca0c602973b63697dca088" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.328579 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xd6n6"] Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329304 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329317 4922 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329328 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329334 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329345 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5969f093-753c-4213-8312-4a5c43cc6519" containerName="extract-utilities" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329351 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5969f093-753c-4213-8312-4a5c43cc6519" containerName="extract-utilities" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329359 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerName="extract-utilities" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329365 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerName="extract-utilities" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329373 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerName="extract-content" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329379 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerName="extract-content" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329385 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5969f093-753c-4213-8312-4a5c43cc6519" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329390 4922 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5969f093-753c-4213-8312-4a5c43cc6519" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329396 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" containerName="extract-content" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329401 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" containerName="extract-content" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329410 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329416 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329427 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081e1e41-0f63-463a-b699-4c680f61122b" containerName="marketplace-operator" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329433 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="081e1e41-0f63-463a-b699-4c680f61122b" containerName="marketplace-operator" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329441 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" containerName="extract-utilities" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329446 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" containerName="extract-utilities" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329454 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5969f093-753c-4213-8312-4a5c43cc6519" containerName="extract-content" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329461 4922 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5969f093-753c-4213-8312-4a5c43cc6519" containerName="extract-content" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329467 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerName="extract-content" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329473 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerName="extract-content" Sep 29 09:49:07 crc kubenswrapper[4922]: E0929 09:49:07.329481 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerName="extract-utilities" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.329487 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerName="extract-utilities" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.330067 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="081e1e41-0f63-463a-b699-4c680f61122b" containerName="marketplace-operator" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.330080 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.330086 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5969f093-753c-4213-8312-4a5c43cc6519" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.330094 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.330102 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" containerName="registry-server" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 
09:49:07.330927 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.337252 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.342249 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xd6n6"] Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.432424 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fkw4t" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.463286 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0118e414-3687-49dc-acc6-454d86e13dfd" path="/var/lib/kubelet/pods/0118e414-3687-49dc-acc6-454d86e13dfd/volumes" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.464176 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="081e1e41-0f63-463a-b699-4c680f61122b" path="/var/lib/kubelet/pods/081e1e41-0f63-463a-b699-4c680f61122b/volumes" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.464621 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5969f093-753c-4213-8312-4a5c43cc6519" path="/var/lib/kubelet/pods/5969f093-753c-4213-8312-4a5c43cc6519/volumes" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.465790 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40832f2-23b9-4c87-8221-f5b790062ebd" path="/var/lib/kubelet/pods/c40832f2-23b9-4c87-8221-f5b790062ebd/volumes" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.466348 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c6b228-c8fb-436e-bd7b-b2a0d78ae639" path="/var/lib/kubelet/pods/d9c6b228-c8fb-436e-bd7b-b2a0d78ae639/volumes" Sep 29 09:49:07 crc 
kubenswrapper[4922]: I0929 09:49:07.483271 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nn8v\" (UniqueName: \"kubernetes.io/projected/fbde2060-f54a-448a-a391-dbb6f2cf95e8-kube-api-access-2nn8v\") pod \"redhat-marketplace-xd6n6\" (UID: \"fbde2060-f54a-448a-a391-dbb6f2cf95e8\") " pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.483366 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbde2060-f54a-448a-a391-dbb6f2cf95e8-utilities\") pod \"redhat-marketplace-xd6n6\" (UID: \"fbde2060-f54a-448a-a391-dbb6f2cf95e8\") " pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.483445 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbde2060-f54a-448a-a391-dbb6f2cf95e8-catalog-content\") pod \"redhat-marketplace-xd6n6\" (UID: \"fbde2060-f54a-448a-a391-dbb6f2cf95e8\") " pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.533544 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g5p6k"] Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.536807 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.540069 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.553523 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5p6k"] Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.584657 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbde2060-f54a-448a-a391-dbb6f2cf95e8-utilities\") pod \"redhat-marketplace-xd6n6\" (UID: \"fbde2060-f54a-448a-a391-dbb6f2cf95e8\") " pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.584740 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbde2060-f54a-448a-a391-dbb6f2cf95e8-catalog-content\") pod \"redhat-marketplace-xd6n6\" (UID: \"fbde2060-f54a-448a-a391-dbb6f2cf95e8\") " pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.584849 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nn8v\" (UniqueName: \"kubernetes.io/projected/fbde2060-f54a-448a-a391-dbb6f2cf95e8-kube-api-access-2nn8v\") pod \"redhat-marketplace-xd6n6\" (UID: \"fbde2060-f54a-448a-a391-dbb6f2cf95e8\") " pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.585393 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbde2060-f54a-448a-a391-dbb6f2cf95e8-utilities\") pod \"redhat-marketplace-xd6n6\" (UID: \"fbde2060-f54a-448a-a391-dbb6f2cf95e8\") " pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 
29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.585409 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbde2060-f54a-448a-a391-dbb6f2cf95e8-catalog-content\") pod \"redhat-marketplace-xd6n6\" (UID: \"fbde2060-f54a-448a-a391-dbb6f2cf95e8\") " pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.607459 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nn8v\" (UniqueName: \"kubernetes.io/projected/fbde2060-f54a-448a-a391-dbb6f2cf95e8-kube-api-access-2nn8v\") pod \"redhat-marketplace-xd6n6\" (UID: \"fbde2060-f54a-448a-a391-dbb6f2cf95e8\") " pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.686382 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42bd8ff0-fc46-401b-941c-26b4a171ba7d-catalog-content\") pod \"redhat-operators-g5p6k\" (UID: \"42bd8ff0-fc46-401b-941c-26b4a171ba7d\") " pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.686466 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq2t4\" (UniqueName: \"kubernetes.io/projected/42bd8ff0-fc46-401b-941c-26b4a171ba7d-kube-api-access-wq2t4\") pod \"redhat-operators-g5p6k\" (UID: \"42bd8ff0-fc46-401b-941c-26b4a171ba7d\") " pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.686489 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42bd8ff0-fc46-401b-941c-26b4a171ba7d-utilities\") pod \"redhat-operators-g5p6k\" (UID: \"42bd8ff0-fc46-401b-941c-26b4a171ba7d\") " 
pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.687254 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.788915 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq2t4\" (UniqueName: \"kubernetes.io/projected/42bd8ff0-fc46-401b-941c-26b4a171ba7d-kube-api-access-wq2t4\") pod \"redhat-operators-g5p6k\" (UID: \"42bd8ff0-fc46-401b-941c-26b4a171ba7d\") " pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.789006 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42bd8ff0-fc46-401b-941c-26b4a171ba7d-utilities\") pod \"redhat-operators-g5p6k\" (UID: \"42bd8ff0-fc46-401b-941c-26b4a171ba7d\") " pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.789102 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42bd8ff0-fc46-401b-941c-26b4a171ba7d-catalog-content\") pod \"redhat-operators-g5p6k\" (UID: \"42bd8ff0-fc46-401b-941c-26b4a171ba7d\") " pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.789629 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42bd8ff0-fc46-401b-941c-26b4a171ba7d-utilities\") pod \"redhat-operators-g5p6k\" (UID: \"42bd8ff0-fc46-401b-941c-26b4a171ba7d\") " pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.791435 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/42bd8ff0-fc46-401b-941c-26b4a171ba7d-catalog-content\") pod \"redhat-operators-g5p6k\" (UID: \"42bd8ff0-fc46-401b-941c-26b4a171ba7d\") " pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.809652 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq2t4\" (UniqueName: \"kubernetes.io/projected/42bd8ff0-fc46-401b-941c-26b4a171ba7d-kube-api-access-wq2t4\") pod \"redhat-operators-g5p6k\" (UID: \"42bd8ff0-fc46-401b-941c-26b4a171ba7d\") " pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.853472 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:07 crc kubenswrapper[4922]: I0929 09:49:07.912311 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xd6n6"] Sep 29 09:49:07 crc kubenswrapper[4922]: W0929 09:49:07.923657 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbde2060_f54a_448a_a391_dbb6f2cf95e8.slice/crio-7ede0181cf0258ae1c25666dc7d874ebc2eeba3b94b43a1e73a971148ea6a76c WatchSource:0}: Error finding container 7ede0181cf0258ae1c25666dc7d874ebc2eeba3b94b43a1e73a971148ea6a76c: Status 404 returned error can't find the container with id 7ede0181cf0258ae1c25666dc7d874ebc2eeba3b94b43a1e73a971148ea6a76c Sep 29 09:49:08 crc kubenswrapper[4922]: I0929 09:49:08.270594 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5p6k"] Sep 29 09:49:08 crc kubenswrapper[4922]: W0929 09:49:08.280879 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42bd8ff0_fc46_401b_941c_26b4a171ba7d.slice/crio-46ec9be4b6f51121936bfa454eec2a6cd9260bdef19c7ace57374f3ab5ac1b45 
WatchSource:0}: Error finding container 46ec9be4b6f51121936bfa454eec2a6cd9260bdef19c7ace57374f3ab5ac1b45: Status 404 returned error can't find the container with id 46ec9be4b6f51121936bfa454eec2a6cd9260bdef19c7ace57374f3ab5ac1b45 Sep 29 09:49:08 crc kubenswrapper[4922]: I0929 09:49:08.441070 4922 generic.go:334] "Generic (PLEG): container finished" podID="fbde2060-f54a-448a-a391-dbb6f2cf95e8" containerID="67383b306493993acbf3da8d3beba0a3d69cad7f520c5df52b70d6f0799c07a4" exitCode=0 Sep 29 09:49:08 crc kubenswrapper[4922]: I0929 09:49:08.441736 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd6n6" event={"ID":"fbde2060-f54a-448a-a391-dbb6f2cf95e8","Type":"ContainerDied","Data":"67383b306493993acbf3da8d3beba0a3d69cad7f520c5df52b70d6f0799c07a4"} Sep 29 09:49:08 crc kubenswrapper[4922]: I0929 09:49:08.441774 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd6n6" event={"ID":"fbde2060-f54a-448a-a391-dbb6f2cf95e8","Type":"ContainerStarted","Data":"7ede0181cf0258ae1c25666dc7d874ebc2eeba3b94b43a1e73a971148ea6a76c"} Sep 29 09:49:08 crc kubenswrapper[4922]: I0929 09:49:08.445217 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5p6k" event={"ID":"42bd8ff0-fc46-401b-941c-26b4a171ba7d","Type":"ContainerStarted","Data":"46ec9be4b6f51121936bfa454eec2a6cd9260bdef19c7ace57374f3ab5ac1b45"} Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.455725 4922 generic.go:334] "Generic (PLEG): container finished" podID="42bd8ff0-fc46-401b-941c-26b4a171ba7d" containerID="04eff711f532ede67c0b74d8ba5a9d2594fbed97820b155a2e5628a7dfbe60cd" exitCode=0 Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.459766 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd6n6" 
event={"ID":"fbde2060-f54a-448a-a391-dbb6f2cf95e8","Type":"ContainerStarted","Data":"f106081342592f096e25a23147d7eaa96aee7cbaf3b3df43d33254cc8942af1b"} Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.459814 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5p6k" event={"ID":"42bd8ff0-fc46-401b-941c-26b4a171ba7d","Type":"ContainerDied","Data":"04eff711f532ede67c0b74d8ba5a9d2594fbed97820b155a2e5628a7dfbe60cd"} Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.735112 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2dckj"] Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.736946 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.741212 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.744900 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dckj"] Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.922609 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6981ca2-b9f8-4907-91f6-c470174f2d9e-utilities\") pod \"certified-operators-2dckj\" (UID: \"c6981ca2-b9f8-4907-91f6-c470174f2d9e\") " pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.922667 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6981ca2-b9f8-4907-91f6-c470174f2d9e-catalog-content\") pod \"certified-operators-2dckj\" (UID: \"c6981ca2-b9f8-4907-91f6-c470174f2d9e\") " 
pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.922729 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k782k\" (UniqueName: \"kubernetes.io/projected/c6981ca2-b9f8-4907-91f6-c470174f2d9e-kube-api-access-k782k\") pod \"certified-operators-2dckj\" (UID: \"c6981ca2-b9f8-4907-91f6-c470174f2d9e\") " pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.924023 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ghrvd"] Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.927651 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.928940 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ghrvd"] Sep 29 09:49:09 crc kubenswrapper[4922]: I0929 09:49:09.930805 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.024962 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c79985-eb4e-4c5a-a463-356e7c217ed0-catalog-content\") pod \"community-operators-ghrvd\" (UID: \"70c79985-eb4e-4c5a-a463-356e7c217ed0\") " pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.025038 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k782k\" (UniqueName: \"kubernetes.io/projected/c6981ca2-b9f8-4907-91f6-c470174f2d9e-kube-api-access-k782k\") pod \"certified-operators-2dckj\" (UID: \"c6981ca2-b9f8-4907-91f6-c470174f2d9e\") " 
pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.025080 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49dbq\" (UniqueName: \"kubernetes.io/projected/70c79985-eb4e-4c5a-a463-356e7c217ed0-kube-api-access-49dbq\") pod \"community-operators-ghrvd\" (UID: \"70c79985-eb4e-4c5a-a463-356e7c217ed0\") " pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.025104 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c79985-eb4e-4c5a-a463-356e7c217ed0-utilities\") pod \"community-operators-ghrvd\" (UID: \"70c79985-eb4e-4c5a-a463-356e7c217ed0\") " pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.025153 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6981ca2-b9f8-4907-91f6-c470174f2d9e-utilities\") pod \"certified-operators-2dckj\" (UID: \"c6981ca2-b9f8-4907-91f6-c470174f2d9e\") " pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.025179 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6981ca2-b9f8-4907-91f6-c470174f2d9e-catalog-content\") pod \"certified-operators-2dckj\" (UID: \"c6981ca2-b9f8-4907-91f6-c470174f2d9e\") " pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.025749 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6981ca2-b9f8-4907-91f6-c470174f2d9e-catalog-content\") pod \"certified-operators-2dckj\" (UID: 
\"c6981ca2-b9f8-4907-91f6-c470174f2d9e\") " pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.025780 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6981ca2-b9f8-4907-91f6-c470174f2d9e-utilities\") pod \"certified-operators-2dckj\" (UID: \"c6981ca2-b9f8-4907-91f6-c470174f2d9e\") " pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.047996 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k782k\" (UniqueName: \"kubernetes.io/projected/c6981ca2-b9f8-4907-91f6-c470174f2d9e-kube-api-access-k782k\") pod \"certified-operators-2dckj\" (UID: \"c6981ca2-b9f8-4907-91f6-c470174f2d9e\") " pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.065311 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.125508 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49dbq\" (UniqueName: \"kubernetes.io/projected/70c79985-eb4e-4c5a-a463-356e7c217ed0-kube-api-access-49dbq\") pod \"community-operators-ghrvd\" (UID: \"70c79985-eb4e-4c5a-a463-356e7c217ed0\") " pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.125539 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c79985-eb4e-4c5a-a463-356e7c217ed0-utilities\") pod \"community-operators-ghrvd\" (UID: \"70c79985-eb4e-4c5a-a463-356e7c217ed0\") " pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.125623 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c79985-eb4e-4c5a-a463-356e7c217ed0-catalog-content\") pod \"community-operators-ghrvd\" (UID: \"70c79985-eb4e-4c5a-a463-356e7c217ed0\") " pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.126134 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c79985-eb4e-4c5a-a463-356e7c217ed0-catalog-content\") pod \"community-operators-ghrvd\" (UID: \"70c79985-eb4e-4c5a-a463-356e7c217ed0\") " pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.126431 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c79985-eb4e-4c5a-a463-356e7c217ed0-utilities\") pod \"community-operators-ghrvd\" (UID: \"70c79985-eb4e-4c5a-a463-356e7c217ed0\") " pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.151549 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49dbq\" (UniqueName: \"kubernetes.io/projected/70c79985-eb4e-4c5a-a463-356e7c217ed0-kube-api-access-49dbq\") pod \"community-operators-ghrvd\" (UID: \"70c79985-eb4e-4c5a-a463-356e7c217ed0\") " pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.248445 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.315346 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dckj"] Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.477815 4922 generic.go:334] "Generic (PLEG): container finished" podID="fbde2060-f54a-448a-a391-dbb6f2cf95e8" containerID="f106081342592f096e25a23147d7eaa96aee7cbaf3b3df43d33254cc8942af1b" exitCode=0 Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.477949 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd6n6" event={"ID":"fbde2060-f54a-448a-a391-dbb6f2cf95e8","Type":"ContainerDied","Data":"f106081342592f096e25a23147d7eaa96aee7cbaf3b3df43d33254cc8942af1b"} Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.495093 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ghrvd"] Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.514546 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5p6k" event={"ID":"42bd8ff0-fc46-401b-941c-26b4a171ba7d","Type":"ContainerStarted","Data":"9706d26412a6684498de86d7183266a3fe6d25cf9f1b8f146108ca4ccd4a3c1d"} Sep 29 09:49:10 crc kubenswrapper[4922]: I0929 09:49:10.537035 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dckj" event={"ID":"c6981ca2-b9f8-4907-91f6-c470174f2d9e","Type":"ContainerStarted","Data":"b3c26b9f8d6efe3c7eaed1a3fe9c611a439ebd235a2110761a2678d8033e6ea0"} Sep 29 09:49:11 crc kubenswrapper[4922]: I0929 09:49:11.543665 4922 generic.go:334] "Generic (PLEG): container finished" podID="70c79985-eb4e-4c5a-a463-356e7c217ed0" containerID="543f2656dff70fd356e1e743f3c2078678c553fece39e32cd4fdfd1d5fff5031" exitCode=0 Sep 29 09:49:11 crc kubenswrapper[4922]: I0929 09:49:11.543816 4922 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-ghrvd" event={"ID":"70c79985-eb4e-4c5a-a463-356e7c217ed0","Type":"ContainerDied","Data":"543f2656dff70fd356e1e743f3c2078678c553fece39e32cd4fdfd1d5fff5031"} Sep 29 09:49:11 crc kubenswrapper[4922]: I0929 09:49:11.544928 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ghrvd" event={"ID":"70c79985-eb4e-4c5a-a463-356e7c217ed0","Type":"ContainerStarted","Data":"197b3d948eeec8a7ea0c97452c4d673ffeba9e6f03a577f1830a63b4e378ff79"} Sep 29 09:49:11 crc kubenswrapper[4922]: I0929 09:49:11.547793 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xd6n6" event={"ID":"fbde2060-f54a-448a-a391-dbb6f2cf95e8","Type":"ContainerStarted","Data":"8ad2276f985e0ac65061b6cb43696a7f102b3a7999c91c9c5040dd508b51d3cb"} Sep 29 09:49:11 crc kubenswrapper[4922]: I0929 09:49:11.550209 4922 generic.go:334] "Generic (PLEG): container finished" podID="42bd8ff0-fc46-401b-941c-26b4a171ba7d" containerID="9706d26412a6684498de86d7183266a3fe6d25cf9f1b8f146108ca4ccd4a3c1d" exitCode=0 Sep 29 09:49:11 crc kubenswrapper[4922]: I0929 09:49:11.550269 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5p6k" event={"ID":"42bd8ff0-fc46-401b-941c-26b4a171ba7d","Type":"ContainerDied","Data":"9706d26412a6684498de86d7183266a3fe6d25cf9f1b8f146108ca4ccd4a3c1d"} Sep 29 09:49:11 crc kubenswrapper[4922]: I0929 09:49:11.556031 4922 generic.go:334] "Generic (PLEG): container finished" podID="c6981ca2-b9f8-4907-91f6-c470174f2d9e" containerID="156de8ac32837a6d5d4ca3b8d0528a3a05e73f050ec80b651bbf98c6f9322ba5" exitCode=0 Sep 29 09:49:11 crc kubenswrapper[4922]: I0929 09:49:11.556113 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dckj" 
event={"ID":"c6981ca2-b9f8-4907-91f6-c470174f2d9e","Type":"ContainerDied","Data":"156de8ac32837a6d5d4ca3b8d0528a3a05e73f050ec80b651bbf98c6f9322ba5"} Sep 29 09:49:11 crc kubenswrapper[4922]: I0929 09:49:11.617888 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xd6n6" podStartSLOduration=2.104042416 podStartE2EDuration="4.61787259s" podCreationTimestamp="2025-09-29 09:49:07 +0000 UTC" firstStartedPulling="2025-09-29 09:49:08.44261018 +0000 UTC m=+273.808840444" lastFinishedPulling="2025-09-29 09:49:10.956440354 +0000 UTC m=+276.322670618" observedRunningTime="2025-09-29 09:49:11.616621165 +0000 UTC m=+276.982851429" watchObservedRunningTime="2025-09-29 09:49:11.61787259 +0000 UTC m=+276.984102854" Sep 29 09:49:12 crc kubenswrapper[4922]: I0929 09:49:12.562540 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5p6k" event={"ID":"42bd8ff0-fc46-401b-941c-26b4a171ba7d","Type":"ContainerStarted","Data":"046688457c687ba90fb7a8fa9e94f04b6288dd55f99973f5f177cc0e3ae2157c"} Sep 29 09:49:12 crc kubenswrapper[4922]: I0929 09:49:12.565115 4922 generic.go:334] "Generic (PLEG): container finished" podID="c6981ca2-b9f8-4907-91f6-c470174f2d9e" containerID="a016f3883da219714cfa95f4bcc54a851ce8fedf539c286b1b6da98736209bbe" exitCode=0 Sep 29 09:49:12 crc kubenswrapper[4922]: I0929 09:49:12.565197 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dckj" event={"ID":"c6981ca2-b9f8-4907-91f6-c470174f2d9e","Type":"ContainerDied","Data":"a016f3883da219714cfa95f4bcc54a851ce8fedf539c286b1b6da98736209bbe"} Sep 29 09:49:12 crc kubenswrapper[4922]: I0929 09:49:12.583231 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g5p6k" podStartSLOduration=3.101411825 podStartE2EDuration="5.583210356s" podCreationTimestamp="2025-09-29 09:49:07 +0000 UTC" 
firstStartedPulling="2025-09-29 09:49:09.457502421 +0000 UTC m=+274.823732685" lastFinishedPulling="2025-09-29 09:49:11.939300952 +0000 UTC m=+277.305531216" observedRunningTime="2025-09-29 09:49:12.580990873 +0000 UTC m=+277.947221147" watchObservedRunningTime="2025-09-29 09:49:12.583210356 +0000 UTC m=+277.949440630" Sep 29 09:49:13 crc kubenswrapper[4922]: I0929 09:49:13.573480 4922 generic.go:334] "Generic (PLEG): container finished" podID="70c79985-eb4e-4c5a-a463-356e7c217ed0" containerID="5cfa549ebe3769ac4d4f81f30a1f1a7056de881172b4583ea6508a106e909b8c" exitCode=0 Sep 29 09:49:13 crc kubenswrapper[4922]: I0929 09:49:13.574523 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ghrvd" event={"ID":"70c79985-eb4e-4c5a-a463-356e7c217ed0","Type":"ContainerDied","Data":"5cfa549ebe3769ac4d4f81f30a1f1a7056de881172b4583ea6508a106e909b8c"} Sep 29 09:49:14 crc kubenswrapper[4922]: I0929 09:49:14.585944 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dckj" event={"ID":"c6981ca2-b9f8-4907-91f6-c470174f2d9e","Type":"ContainerStarted","Data":"528454a1d532bab0aba5b5418790ad71a43a89c817b4b1d4778136b14e898a5f"} Sep 29 09:49:14 crc kubenswrapper[4922]: I0929 09:49:14.590266 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ghrvd" event={"ID":"70c79985-eb4e-4c5a-a463-356e7c217ed0","Type":"ContainerStarted","Data":"1909ee5ec97199bf012d218d9f4ba05267a5814ff737be49c4094bcce13131e5"} Sep 29 09:49:14 crc kubenswrapper[4922]: I0929 09:49:14.631264 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2dckj" podStartSLOduration=4.195706563 podStartE2EDuration="5.63124367s" podCreationTimestamp="2025-09-29 09:49:09 +0000 UTC" firstStartedPulling="2025-09-29 09:49:11.557435228 +0000 UTC m=+276.923665492" lastFinishedPulling="2025-09-29 09:49:12.992972335 +0000 UTC 
m=+278.359202599" observedRunningTime="2025-09-29 09:49:14.61379981 +0000 UTC m=+279.980030074" watchObservedRunningTime="2025-09-29 09:49:14.63124367 +0000 UTC m=+279.997473934" Sep 29 09:49:14 crc kubenswrapper[4922]: I0929 09:49:14.634636 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ghrvd" podStartSLOduration=2.966819437 podStartE2EDuration="5.634623766s" podCreationTimestamp="2025-09-29 09:49:09 +0000 UTC" firstStartedPulling="2025-09-29 09:49:11.54577689 +0000 UTC m=+276.912007154" lastFinishedPulling="2025-09-29 09:49:14.213581219 +0000 UTC m=+279.579811483" observedRunningTime="2025-09-29 09:49:14.630658664 +0000 UTC m=+279.996888948" watchObservedRunningTime="2025-09-29 09:49:14.634623766 +0000 UTC m=+280.000854030" Sep 29 09:49:17 crc kubenswrapper[4922]: I0929 09:49:17.689025 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:17 crc kubenswrapper[4922]: I0929 09:49:17.689994 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:17 crc kubenswrapper[4922]: I0929 09:49:17.740946 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:17 crc kubenswrapper[4922]: I0929 09:49:17.854051 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:17 crc kubenswrapper[4922]: I0929 09:49:17.854183 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:17 crc kubenswrapper[4922]: I0929 09:49:17.894304 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:18 crc kubenswrapper[4922]: I0929 09:49:18.656471 
4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xd6n6" Sep 29 09:49:18 crc kubenswrapper[4922]: I0929 09:49:18.660077 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g5p6k" Sep 29 09:49:20 crc kubenswrapper[4922]: I0929 09:49:20.066874 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:20 crc kubenswrapper[4922]: I0929 09:49:20.066925 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:20 crc kubenswrapper[4922]: I0929 09:49:20.115853 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:20 crc kubenswrapper[4922]: I0929 09:49:20.250052 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:20 crc kubenswrapper[4922]: I0929 09:49:20.250116 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:20 crc kubenswrapper[4922]: I0929 09:49:20.290501 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:20 crc kubenswrapper[4922]: I0929 09:49:20.657795 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ghrvd" Sep 29 09:49:20 crc kubenswrapper[4922]: I0929 09:49:20.657927 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2dckj" Sep 29 09:49:59 crc kubenswrapper[4922]: I0929 09:49:59.071415 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:49:59 crc kubenswrapper[4922]: I0929 09:49:59.072372 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:50:29 crc kubenswrapper[4922]: I0929 09:50:29.070420 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:50:29 crc kubenswrapper[4922]: I0929 09:50:29.071330 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:50:59 crc kubenswrapper[4922]: I0929 09:50:59.071036 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:50:59 crc kubenswrapper[4922]: I0929 09:50:59.072169 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:50:59 crc kubenswrapper[4922]: I0929 09:50:59.072253 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:50:59 crc kubenswrapper[4922]: I0929 09:50:59.073455 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cff6934a9169a4b4504fef15ca7f5cd9d69c634d61387892ad6e0193d51f4eb2"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 09:50:59 crc kubenswrapper[4922]: I0929 09:50:59.073531 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://cff6934a9169a4b4504fef15ca7f5cd9d69c634d61387892ad6e0193d51f4eb2" gracePeriod=600 Sep 29 09:50:59 crc kubenswrapper[4922]: I0929 09:50:59.245452 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="cff6934a9169a4b4504fef15ca7f5cd9d69c634d61387892ad6e0193d51f4eb2" exitCode=0 Sep 29 09:50:59 crc kubenswrapper[4922]: I0929 09:50:59.245508 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"cff6934a9169a4b4504fef15ca7f5cd9d69c634d61387892ad6e0193d51f4eb2"} Sep 29 09:50:59 crc kubenswrapper[4922]: I0929 09:50:59.245547 4922 scope.go:117] "RemoveContainer" containerID="e34ca5137e1e43806655d40497c30504d0929e61ecf3fd7cfcfe1d2a57203db2" Sep 29 09:51:00 crc kubenswrapper[4922]: I0929 09:51:00.252124 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"6ec354693b7252058b868e7450deb01318aa6f043106e5484a5d126aed53e14b"} Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.753301 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cc627"] Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.755054 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.767730 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cc627"] Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.896630 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ttbz\" (UniqueName: \"kubernetes.io/projected/a67e2f90-b7b3-4765-b944-d0e4f988fc54-kube-api-access-5ttbz\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.896701 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a67e2f90-b7b3-4765-b944-d0e4f988fc54-bound-sa-token\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.896733 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a67e2f90-b7b3-4765-b944-d0e4f988fc54-registry-certificates\") pod 
\"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.896759 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a67e2f90-b7b3-4765-b944-d0e4f988fc54-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.896791 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a67e2f90-b7b3-4765-b944-d0e4f988fc54-trusted-ca\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.896823 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a67e2f90-b7b3-4765-b944-d0e4f988fc54-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.896891 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a67e2f90-b7b3-4765-b944-d0e4f988fc54-registry-tls\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.897010 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.930793 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.998781 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a67e2f90-b7b3-4765-b944-d0e4f988fc54-bound-sa-token\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.998883 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a67e2f90-b7b3-4765-b944-d0e4f988fc54-registry-certificates\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.998916 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a67e2f90-b7b3-4765-b944-d0e4f988fc54-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") 
" pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.998947 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a67e2f90-b7b3-4765-b944-d0e4f988fc54-trusted-ca\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.998977 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a67e2f90-b7b3-4765-b944-d0e4f988fc54-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.999014 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a67e2f90-b7b3-4765-b944-d0e4f988fc54-registry-tls\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:25 crc kubenswrapper[4922]: I0929 09:51:25.999095 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ttbz\" (UniqueName: \"kubernetes.io/projected/a67e2f90-b7b3-4765-b944-d0e4f988fc54-kube-api-access-5ttbz\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.000632 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a67e2f90-b7b3-4765-b944-d0e4f988fc54-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.000950 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a67e2f90-b7b3-4765-b944-d0e4f988fc54-registry-certificates\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.001990 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a67e2f90-b7b3-4765-b944-d0e4f988fc54-trusted-ca\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.007446 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a67e2f90-b7b3-4765-b944-d0e4f988fc54-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.007592 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a67e2f90-b7b3-4765-b944-d0e4f988fc54-registry-tls\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.020592 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a67e2f90-b7b3-4765-b944-d0e4f988fc54-bound-sa-token\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.027709 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ttbz\" (UniqueName: \"kubernetes.io/projected/a67e2f90-b7b3-4765-b944-d0e4f988fc54-kube-api-access-5ttbz\") pod \"image-registry-66df7c8f76-cc627\" (UID: \"a67e2f90-b7b3-4765-b944-d0e4f988fc54\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.075747 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.309936 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cc627"] Sep 29 09:51:26 crc kubenswrapper[4922]: I0929 09:51:26.418215 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cc627" event={"ID":"a67e2f90-b7b3-4765-b944-d0e4f988fc54","Type":"ContainerStarted","Data":"21b667b7a143132c4c0e150d4b6aea246831b01b333f2899a374a656a2c40030"} Sep 29 09:51:27 crc kubenswrapper[4922]: I0929 09:51:27.426914 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cc627" event={"ID":"a67e2f90-b7b3-4765-b944-d0e4f988fc54","Type":"ContainerStarted","Data":"1239d4a31c8f89b3503e86de0c7ce2e2a16829b52731d631da70218f52e9c6f2"} Sep 29 09:51:27 crc kubenswrapper[4922]: I0929 09:51:27.428022 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:27 crc kubenswrapper[4922]: I0929 09:51:27.453643 4922 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-cc627" podStartSLOduration=2.45362257 podStartE2EDuration="2.45362257s" podCreationTimestamp="2025-09-29 09:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:51:27.45143779 +0000 UTC m=+412.817668054" watchObservedRunningTime="2025-09-29 09:51:27.45362257 +0000 UTC m=+412.819852834" Sep 29 09:51:46 crc kubenswrapper[4922]: I0929 09:51:46.084240 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-cc627" Sep 29 09:51:46 crc kubenswrapper[4922]: I0929 09:51:46.153645 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hnjbq"] Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.197203 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" podUID="b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" containerName="registry" containerID="cri-o://5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2" gracePeriod=30 Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.559561 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.613076 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-ca-trust-extracted\") pod \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.613327 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.613390 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-bound-sa-token\") pod \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.613419 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-tls\") pod \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.613477 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkndf\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-kube-api-access-mkndf\") pod \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.613510 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-trusted-ca\") pod \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.613538 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-certificates\") pod \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.613558 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-installation-pull-secrets\") pod \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\" (UID: \"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc\") " Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.615053 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.615205 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.622155 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.623599 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.623909 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-kube-api-access-mkndf" (OuterVolumeSpecName: "kube-api-access-mkndf") pod "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc"). InnerVolumeSpecName "kube-api-access-mkndf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.631406 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.632387 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.635035 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" (UID: "b8af523a-fbfe-4e4a-9221-7f8a3a761ecc"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.693679 4922 generic.go:334] "Generic (PLEG): container finished" podID="b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" containerID="5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2" exitCode=0 Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.693733 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" event={"ID":"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc","Type":"ContainerDied","Data":"5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2"} Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.693768 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" event={"ID":"b8af523a-fbfe-4e4a-9221-7f8a3a761ecc","Type":"ContainerDied","Data":"d780246599051a76d7ccf7271eb47af0bf4dfcd1105b449e9921bcb0738730ab"} Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.693754 4922 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hnjbq" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.693822 4922 scope.go:117] "RemoveContainer" containerID="5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.715008 4922 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.715043 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.715054 4922 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.715065 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkndf\" (UniqueName: \"kubernetes.io/projected/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-kube-api-access-mkndf\") on node \"crc\" DevicePath \"\"" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.715074 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.715083 4922 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.715092 4922 
reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.717558 4922 scope.go:117] "RemoveContainer" containerID="5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2" Sep 29 09:52:11 crc kubenswrapper[4922]: E0929 09:52:11.718094 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2\": container with ID starting with 5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2 not found: ID does not exist" containerID="5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.718161 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2"} err="failed to get container status \"5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2\": rpc error: code = NotFound desc = could not find container \"5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2\": container with ID starting with 5111ebd100acce9101ab36e7106e6007ab8aaf1e04da5d2f09616ec7342334c2 not found: ID does not exist" Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.731619 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hnjbq"] Sep 29 09:52:11 crc kubenswrapper[4922]: I0929 09:52:11.736410 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hnjbq"] Sep 29 09:52:13 crc kubenswrapper[4922]: I0929 09:52:13.458523 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" 
path="/var/lib/kubelet/pods/b8af523a-fbfe-4e4a-9221-7f8a3a761ecc/volumes" Sep 29 09:52:59 crc kubenswrapper[4922]: I0929 09:52:59.070910 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:52:59 crc kubenswrapper[4922]: I0929 09:52:59.071789 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:53:29 crc kubenswrapper[4922]: I0929 09:53:29.070664 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:53:29 crc kubenswrapper[4922]: I0929 09:53:29.071611 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:53:59 crc kubenswrapper[4922]: I0929 09:53:59.070743 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:53:59 crc kubenswrapper[4922]: I0929 09:53:59.071414 4922 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:53:59 crc kubenswrapper[4922]: I0929 09:53:59.071475 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:53:59 crc kubenswrapper[4922]: I0929 09:53:59.072287 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ec354693b7252058b868e7450deb01318aa6f043106e5484a5d126aed53e14b"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 09:53:59 crc kubenswrapper[4922]: I0929 09:53:59.072349 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://6ec354693b7252058b868e7450deb01318aa6f043106e5484a5d126aed53e14b" gracePeriod=600 Sep 29 09:53:59 crc kubenswrapper[4922]: I0929 09:53:59.331480 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="6ec354693b7252058b868e7450deb01318aa6f043106e5484a5d126aed53e14b" exitCode=0 Sep 29 09:53:59 crc kubenswrapper[4922]: I0929 09:53:59.331548 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"6ec354693b7252058b868e7450deb01318aa6f043106e5484a5d126aed53e14b"} Sep 29 09:53:59 crc kubenswrapper[4922]: I0929 
09:53:59.332179 4922 scope.go:117] "RemoveContainer" containerID="cff6934a9169a4b4504fef15ca7f5cd9d69c634d61387892ad6e0193d51f4eb2" Sep 29 09:54:00 crc kubenswrapper[4922]: I0929 09:54:00.341567 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"438f14a9f27df3e3e3379a1de404ccf8246b85d1a7a877658b63d5fd223866ed"} Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.861221 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q9z92"] Sep 29 09:54:26 crc kubenswrapper[4922]: E0929 09:54:26.862342 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" containerName="registry" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.862362 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" containerName="registry" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.862499 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8af523a-fbfe-4e4a-9221-7f8a3a761ecc" containerName="registry" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.863006 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-q9z92" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.869223 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-snmmn"] Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.869632 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.870073 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.870083 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-snmmn" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.872683 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q9z92"] Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.873659 4922 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wbmht" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.874871 4922 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qmqt2" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.886564 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-x5g9r"] Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.887464 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.892680 4922 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-5l7tb" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.900141 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-snmmn"] Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.906924 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-x5g9r"] Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.983760 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwpfc\" (UniqueName: \"kubernetes.io/projected/7626bb44-b67d-42eb-b912-a9b279f7157d-kube-api-access-bwpfc\") pod \"cert-manager-5b446d88c5-snmmn\" (UID: \"7626bb44-b67d-42eb-b912-a9b279f7157d\") " pod="cert-manager/cert-manager-5b446d88c5-snmmn" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.983823 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7zfn\" (UniqueName: \"kubernetes.io/projected/c9cbdf04-ea9d-4663-94c9-345fb63f3f9c-kube-api-access-l7zfn\") pod \"cert-manager-cainjector-7f985d654d-q9z92\" (UID: \"c9cbdf04-ea9d-4663-94c9-345fb63f3f9c\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q9z92" Sep 29 09:54:26 crc kubenswrapper[4922]: I0929 09:54:26.984028 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcmxn\" (UniqueName: \"kubernetes.io/projected/93e3a2dc-f64b-4766-b851-2faa2e57c4f4-kube-api-access-gcmxn\") pod \"cert-manager-webhook-5655c58dd6-x5g9r\" (UID: \"93e3a2dc-f64b-4766-b851-2faa2e57c4f4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.085603 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwpfc\" (UniqueName: \"kubernetes.io/projected/7626bb44-b67d-42eb-b912-a9b279f7157d-kube-api-access-bwpfc\") pod \"cert-manager-5b446d88c5-snmmn\" (UID: \"7626bb44-b67d-42eb-b912-a9b279f7157d\") " pod="cert-manager/cert-manager-5b446d88c5-snmmn" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.085673 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7zfn\" (UniqueName: \"kubernetes.io/projected/c9cbdf04-ea9d-4663-94c9-345fb63f3f9c-kube-api-access-l7zfn\") pod \"cert-manager-cainjector-7f985d654d-q9z92\" (UID: \"c9cbdf04-ea9d-4663-94c9-345fb63f3f9c\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q9z92" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.085726 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcmxn\" (UniqueName: \"kubernetes.io/projected/93e3a2dc-f64b-4766-b851-2faa2e57c4f4-kube-api-access-gcmxn\") pod \"cert-manager-webhook-5655c58dd6-x5g9r\" (UID: \"93e3a2dc-f64b-4766-b851-2faa2e57c4f4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.108015 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcmxn\" (UniqueName: \"kubernetes.io/projected/93e3a2dc-f64b-4766-b851-2faa2e57c4f4-kube-api-access-gcmxn\") pod \"cert-manager-webhook-5655c58dd6-x5g9r\" (UID: \"93e3a2dc-f64b-4766-b851-2faa2e57c4f4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.108371 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7zfn\" (UniqueName: \"kubernetes.io/projected/c9cbdf04-ea9d-4663-94c9-345fb63f3f9c-kube-api-access-l7zfn\") pod \"cert-manager-cainjector-7f985d654d-q9z92\" (UID: \"c9cbdf04-ea9d-4663-94c9-345fb63f3f9c\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-q9z92" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.110322 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwpfc\" (UniqueName: \"kubernetes.io/projected/7626bb44-b67d-42eb-b912-a9b279f7157d-kube-api-access-bwpfc\") pod \"cert-manager-5b446d88c5-snmmn\" (UID: \"7626bb44-b67d-42eb-b912-a9b279f7157d\") " pod="cert-manager/cert-manager-5b446d88c5-snmmn" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.187957 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-q9z92" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.199358 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-snmmn" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.372417 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.443399 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q9z92"] Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.448770 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.488345 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-q9z92" event={"ID":"c9cbdf04-ea9d-4663-94c9-345fb63f3f9c","Type":"ContainerStarted","Data":"7522b6f3a2d0a01227414244d3ff51d6db98ba27c5e27af3e2e9323d91cd66fa"} Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.496719 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-snmmn"] Sep 29 09:54:27 crc kubenswrapper[4922]: W0929 09:54:27.507710 4922 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7626bb44_b67d_42eb_b912_a9b279f7157d.slice/crio-eab442cbb0de1e01c2eab1a9ddc7c5de6ae9a9f8e2a85271de0f756cde74d816 WatchSource:0}: Error finding container eab442cbb0de1e01c2eab1a9ddc7c5de6ae9a9f8e2a85271de0f756cde74d816: Status 404 returned error can't find the container with id eab442cbb0de1e01c2eab1a9ddc7c5de6ae9a9f8e2a85271de0f756cde74d816 Sep 29 09:54:27 crc kubenswrapper[4922]: I0929 09:54:27.592262 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-x5g9r"] Sep 29 09:54:27 crc kubenswrapper[4922]: W0929 09:54:27.597985 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93e3a2dc_f64b_4766_b851_2faa2e57c4f4.slice/crio-9d024159bee7f886271b5646d5d7f0b68894cecf7ea6166828d430329c22c719 WatchSource:0}: Error finding container 9d024159bee7f886271b5646d5d7f0b68894cecf7ea6166828d430329c22c719: Status 404 returned error can't find the container with id 9d024159bee7f886271b5646d5d7f0b68894cecf7ea6166828d430329c22c719 Sep 29 09:54:28 crc kubenswrapper[4922]: I0929 09:54:28.507067 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-snmmn" event={"ID":"7626bb44-b67d-42eb-b912-a9b279f7157d","Type":"ContainerStarted","Data":"eab442cbb0de1e01c2eab1a9ddc7c5de6ae9a9f8e2a85271de0f756cde74d816"} Sep 29 09:54:28 crc kubenswrapper[4922]: I0929 09:54:28.508563 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" event={"ID":"93e3a2dc-f64b-4766-b851-2faa2e57c4f4","Type":"ContainerStarted","Data":"9d024159bee7f886271b5646d5d7f0b68894cecf7ea6166828d430329c22c719"} Sep 29 09:54:30 crc kubenswrapper[4922]: I0929 09:54:30.526986 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-q9z92" 
event={"ID":"c9cbdf04-ea9d-4663-94c9-345fb63f3f9c","Type":"ContainerStarted","Data":"b1d5c38b8f3d3b884b8e2c6a25bcfa1e576c09603e22f0b5a8a1c2ad863fe50d"} Sep 29 09:54:30 crc kubenswrapper[4922]: I0929 09:54:30.545788 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-q9z92" podStartSLOduration=2.124868118 podStartE2EDuration="4.54577117s" podCreationTimestamp="2025-09-29 09:54:26 +0000 UTC" firstStartedPulling="2025-09-29 09:54:27.448476818 +0000 UTC m=+592.814707082" lastFinishedPulling="2025-09-29 09:54:29.86937987 +0000 UTC m=+595.235610134" observedRunningTime="2025-09-29 09:54:30.541874045 +0000 UTC m=+595.908104319" watchObservedRunningTime="2025-09-29 09:54:30.54577117 +0000 UTC m=+595.912001434" Sep 29 09:54:31 crc kubenswrapper[4922]: I0929 09:54:31.536655 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-snmmn" event={"ID":"7626bb44-b67d-42eb-b912-a9b279f7157d","Type":"ContainerStarted","Data":"91100e5a1de58eb29f5102b0b3b7207f678039544ed2fc089fe73932cc7bb7dc"} Sep 29 09:54:32 crc kubenswrapper[4922]: I0929 09:54:32.542891 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" event={"ID":"93e3a2dc-f64b-4766-b851-2faa2e57c4f4","Type":"ContainerStarted","Data":"243ce65333bc6bd52215be71db071d6f9ffd84b0fb5892a5f091eb3ca16a4015"} Sep 29 09:54:32 crc kubenswrapper[4922]: I0929 09:54:32.543393 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" Sep 29 09:54:32 crc kubenswrapper[4922]: I0929 09:54:32.562335 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" podStartSLOduration=2.775668937 podStartE2EDuration="6.56231037s" podCreationTimestamp="2025-09-29 09:54:26 +0000 UTC" firstStartedPulling="2025-09-29 09:54:27.599928337 +0000 UTC 
m=+592.966158601" lastFinishedPulling="2025-09-29 09:54:31.38656977 +0000 UTC m=+596.752800034" observedRunningTime="2025-09-29 09:54:32.558988141 +0000 UTC m=+597.925218395" watchObservedRunningTime="2025-09-29 09:54:32.56231037 +0000 UTC m=+597.928540634" Sep 29 09:54:32 crc kubenswrapper[4922]: I0929 09:54:32.582179 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-snmmn" podStartSLOduration=2.712141268 podStartE2EDuration="6.582157786s" podCreationTimestamp="2025-09-29 09:54:26 +0000 UTC" firstStartedPulling="2025-09-29 09:54:27.510483868 +0000 UTC m=+592.876714132" lastFinishedPulling="2025-09-29 09:54:31.380500386 +0000 UTC m=+596.746730650" observedRunningTime="2025-09-29 09:54:32.577056528 +0000 UTC m=+597.943286792" watchObservedRunningTime="2025-09-29 09:54:32.582157786 +0000 UTC m=+597.948388060" Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.349686 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tr9bt"] Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.351171 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovn-controller" containerID="cri-o://115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16" gracePeriod=30 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.351271 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="nbdb" containerID="cri-o://2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274" gracePeriod=30 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.351319 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" 
podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da" gracePeriod=30 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.351389 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="northd" containerID="cri-o://7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4" gracePeriod=30 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.351508 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kube-rbac-proxy-node" containerID="cri-o://359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096" gracePeriod=30 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.351418 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovn-acl-logging" containerID="cri-o://1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941" gracePeriod=30 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.351635 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="sbdb" containerID="cri-o://f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661" gracePeriod=30 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.382432 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-x5g9r" Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.401670 4922 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" containerID="cri-o://e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff" gracePeriod=30 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.580819 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovnkube-controller/3.log" Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.583152 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovn-acl-logging/0.log" Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.583891 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovn-controller/0.log" Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584332 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff" exitCode=0 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584362 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da" exitCode=0 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584370 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096" exitCode=0 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584381 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941" exitCode=143 Sep 29 09:54:37 crc kubenswrapper[4922]: 
I0929 09:54:37.584391 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16" exitCode=143 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584437 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff"} Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584468 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da"} Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584481 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096"} Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584491 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941"} Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584502 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16"} Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.584523 4922 scope.go:117] "RemoveContainer" containerID="6b132b7d4fd73cf2c4d0b519359bc310626e6ebb3a5eb1565e00604d5d6fac05" Sep 29 09:54:37 
crc kubenswrapper[4922]: I0929 09:54:37.587680 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/2.log" Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.588132 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/1.log" Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.588173 4922 generic.go:334] "Generic (PLEG): container finished" podID="7dc69012-4e4c-437b-82d8-9d04e2e22e58" containerID="86905bae7fc76fddcc8a482538e2e8667cfae303320d10f72f7a4053c8b9aefa" exitCode=2 Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.588206 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6dfk" event={"ID":"7dc69012-4e4c-437b-82d8-9d04e2e22e58","Type":"ContainerDied","Data":"86905bae7fc76fddcc8a482538e2e8667cfae303320d10f72f7a4053c8b9aefa"} Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.588683 4922 scope.go:117] "RemoveContainer" containerID="86905bae7fc76fddcc8a482538e2e8667cfae303320d10f72f7a4053c8b9aefa" Sep 29 09:54:37 crc kubenswrapper[4922]: E0929 09:54:37.589255 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-h6dfk_openshift-multus(7dc69012-4e4c-437b-82d8-9d04e2e22e58)\"" pod="openshift-multus/multus-h6dfk" podUID="7dc69012-4e4c-437b-82d8-9d04e2e22e58" Sep 29 09:54:37 crc kubenswrapper[4922]: I0929 09:54:37.668790 4922 scope.go:117] "RemoveContainer" containerID="0fa3bb3cef651756b702afab94dd9527125ac32c9baf2911b948b231c2a1e273" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.102778 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovn-acl-logging/0.log" Sep 29 09:54:38 crc 
kubenswrapper[4922]: I0929 09:54:38.103801 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovn-controller/0.log" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.104340 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134138 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-ovn\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134185 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-bin\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134208 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-etc-openvswitch\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134238 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-log-socket\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134256 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-systemd\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134277 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-var-lib-openvswitch\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134291 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-node-log\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134312 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8rch\" (UniqueName: \"kubernetes.io/projected/ee08d9f2-f100-4598-8ab3-5198a21b08f0-kube-api-access-p8rch\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134307 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134290 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134340 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-log-socket" (OuterVolumeSpecName: "log-socket") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134343 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-script-lib\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134307 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134346 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-node-log" (OuterVolumeSpecName: "node-log") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134424 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134472 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-slash" (OuterVolumeSpecName: "host-slash") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134450 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-slash\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134545 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-openvswitch\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134592 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovn-node-metrics-cert\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134633 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134642 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-kubelet\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134667 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134710 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134739 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-ovn-kubernetes\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134766 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-config\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc 
kubenswrapper[4922]: I0929 09:54:38.134794 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-systemd-units\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134826 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-netd\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134825 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134847 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134889 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134872 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-netns\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134859 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134864 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.134930 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-env-overrides\") pod \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\" (UID: \"ee08d9f2-f100-4598-8ab3-5198a21b08f0\") " Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135506 4922 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135523 4922 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-node-log\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135533 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135549 4922 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-slash\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135559 4922 
reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135569 4922 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135581 4922 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135592 4922 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135603 4922 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135614 4922 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135530 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135624 4922 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135654 4922 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135664 4922 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135675 4922 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135684 4922 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-log-socket\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.135676 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.142388 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.144288 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee08d9f2-f100-4598-8ab3-5198a21b08f0-kube-api-access-p8rch" (OuterVolumeSpecName: "kube-api-access-p8rch") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "kube-api-access-p8rch". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.151405 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ee08d9f2-f100-4598-8ab3-5198a21b08f0" (UID: "ee08d9f2-f100-4598-8ab3-5198a21b08f0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.153906 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-svd6q"] Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156137 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovn-acl-logging" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156160 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovn-acl-logging" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156169 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kubecfg-setup" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156176 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kubecfg-setup" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156185 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="nbdb" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156194 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="nbdb" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156203 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kube-rbac-proxy-node" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156210 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kube-rbac-proxy-node" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156218 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" 
containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156224 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156232 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156237 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156247 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="sbdb" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156253 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="sbdb" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156261 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kube-rbac-proxy-ovn-metrics" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156267 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kube-rbac-proxy-ovn-metrics" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156300 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovn-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156307 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovn-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156315 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" 
containerName="northd" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156321 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="northd" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156330 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156337 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156425 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovn-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156435 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="sbdb" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156443 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="northd" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156450 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="nbdb" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156458 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156465 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovn-acl-logging" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156471 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" 
containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156478 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kube-rbac-proxy-ovn-metrics" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156484 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="kube-rbac-proxy-node" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156493 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156499 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156590 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156597 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.156603 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156609 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.156686 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerName="ovnkube-controller" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.158220 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237017 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-cni-netd\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237067 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-run-ovn-kubernetes\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237087 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-var-lib-openvswitch\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237106 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-slash\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237265 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-systemd-units\") pod 
\"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237321 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74n6\" (UniqueName: \"kubernetes.io/projected/556f8437-05c1-4061-9213-c868248f76fb-kube-api-access-r74n6\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237367 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-cni-bin\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237389 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-kubelet\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237411 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-log-socket\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237451 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-run-netns\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237536 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-run-systemd\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237568 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237602 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/556f8437-05c1-4061-9213-c868248f76fb-ovn-node-metrics-cert\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237626 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/556f8437-05c1-4061-9213-c868248f76fb-ovnkube-script-lib\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237644 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-run-openvswitch\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237675 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-node-log\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237710 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-run-ovn\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237726 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/556f8437-05c1-4061-9213-c868248f76fb-ovnkube-config\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237747 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/556f8437-05c1-4061-9213-c868248f76fb-env-overrides\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237766 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-etc-openvswitch\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237799 4922 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237811 4922 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee08d9f2-f100-4598-8ab3-5198a21b08f0-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237825 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8rch\" (UniqueName: \"kubernetes.io/projected/ee08d9f2-f100-4598-8ab3-5198a21b08f0-kube-api-access-p8rch\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237855 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.237866 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee08d9f2-f100-4598-8ab3-5198a21b08f0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.338848 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-run-systemd\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.338915 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.338948 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/556f8437-05c1-4061-9213-c868248f76fb-ovn-node-metrics-cert\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.338973 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/556f8437-05c1-4061-9213-c868248f76fb-ovnkube-script-lib\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.338975 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-run-systemd\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339034 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-run-openvswitch\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339053 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.338991 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-run-openvswitch\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339117 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-node-log\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339152 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-run-ovn\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339177 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/556f8437-05c1-4061-9213-c868248f76fb-ovnkube-config\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 
09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/556f8437-05c1-4061-9213-c868248f76fb-env-overrides\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339221 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-node-log\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339257 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-etc-openvswitch\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339224 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-run-ovn\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339323 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-cni-netd\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339284 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-cni-netd\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339399 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-var-lib-openvswitch\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339433 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-run-ovn-kubernetes\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339463 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-etc-openvswitch\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339469 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-slash\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339515 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-var-lib-openvswitch\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339550 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-run-ovn-kubernetes\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-systemd-units\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339585 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-slash\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339612 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r74n6\" (UniqueName: \"kubernetes.io/projected/556f8437-05c1-4061-9213-c868248f76fb-kube-api-access-r74n6\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339654 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-cni-bin\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339693 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-kubelet\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339736 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-log-socket\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339778 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-run-netns\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339940 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-run-netns\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339990 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-cni-bin\") pod \"ovnkube-node-svd6q\" (UID: 
\"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.340013 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/556f8437-05c1-4061-9213-c868248f76fb-env-overrides\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.340020 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/556f8437-05c1-4061-9213-c868248f76fb-ovnkube-script-lib\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.340031 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-host-kubelet\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.339693 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-systemd-units\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.340074 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/556f8437-05c1-4061-9213-c868248f76fb-log-socket\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc 
kubenswrapper[4922]: I0929 09:54:38.340191 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/556f8437-05c1-4061-9213-c868248f76fb-ovnkube-config\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.342956 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/556f8437-05c1-4061-9213-c868248f76fb-ovn-node-metrics-cert\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.354960 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74n6\" (UniqueName: \"kubernetes.io/projected/556f8437-05c1-4061-9213-c868248f76fb-kube-api-access-r74n6\") pod \"ovnkube-node-svd6q\" (UID: \"556f8437-05c1-4061-9213-c868248f76fb\") " pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.474060 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:38 crc kubenswrapper[4922]: W0929 09:54:38.500586 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod556f8437_05c1_4061_9213_c868248f76fb.slice/crio-5c250c7e74de1454e4e8fae5b7fc9fb927a0a3dbe8dd47c467eca1a0365b3d5b WatchSource:0}: Error finding container 5c250c7e74de1454e4e8fae5b7fc9fb927a0a3dbe8dd47c467eca1a0365b3d5b: Status 404 returned error can't find the container with id 5c250c7e74de1454e4e8fae5b7fc9fb927a0a3dbe8dd47c467eca1a0365b3d5b Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.600334 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovn-acl-logging/0.log" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.600867 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tr9bt_ee08d9f2-f100-4598-8ab3-5198a21b08f0/ovn-controller/0.log" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.601207 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661" exitCode=0 Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.601260 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274" exitCode=0 Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.601304 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661"} Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.601340 4922 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274"} Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.601356 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4"} Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.601376 4922 scope.go:117] "RemoveContainer" containerID="e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.601453 4922 generic.go:334] "Generic (PLEG): container finished" podID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" containerID="7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4" exitCode=0 Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.601522 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" event={"ID":"ee08d9f2-f100-4598-8ab3-5198a21b08f0","Type":"ContainerDied","Data":"b7010633445d722b5310860b71b80808292b33cf43f584358d3709d3b1ce4e9f"} Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.601735 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tr9bt" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.603933 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/2.log" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.605658 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerStarted","Data":"5c250c7e74de1454e4e8fae5b7fc9fb927a0a3dbe8dd47c467eca1a0365b3d5b"} Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.619631 4922 scope.go:117] "RemoveContainer" containerID="f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.699783 4922 scope.go:117] "RemoveContainer" containerID="2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.710961 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tr9bt"] Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.713431 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tr9bt"] Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.723050 4922 scope.go:117] "RemoveContainer" containerID="7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.745855 4922 scope.go:117] "RemoveContainer" containerID="ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.759064 4922 scope.go:117] "RemoveContainer" containerID="359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.772203 4922 scope.go:117] "RemoveContainer" 
containerID="1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.785554 4922 scope.go:117] "RemoveContainer" containerID="115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.800961 4922 scope.go:117] "RemoveContainer" containerID="22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.826707 4922 scope.go:117] "RemoveContainer" containerID="e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.827173 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff\": container with ID starting with e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff not found: ID does not exist" containerID="e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.827200 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff"} err="failed to get container status \"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff\": rpc error: code = NotFound desc = could not find container \"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff\": container with ID starting with e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.827218 4922 scope.go:117] "RemoveContainer" containerID="f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.827442 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\": container with ID starting with f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661 not found: ID does not exist" containerID="f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.827465 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661"} err="failed to get container status \"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\": rpc error: code = NotFound desc = could not find container \"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\": container with ID starting with f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.827478 4922 scope.go:117] "RemoveContainer" containerID="2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.827676 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\": container with ID starting with 2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274 not found: ID does not exist" containerID="2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.827767 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274"} err="failed to get container status \"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\": rpc error: code = NotFound desc = could not find container 
\"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\": container with ID starting with 2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.827850 4922 scope.go:117] "RemoveContainer" containerID="7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.828129 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\": container with ID starting with 7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4 not found: ID does not exist" containerID="7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.828148 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4"} err="failed to get container status \"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\": rpc error: code = NotFound desc = could not find container \"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\": container with ID starting with 7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.828160 4922 scope.go:117] "RemoveContainer" containerID="ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.828580 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\": container with ID starting with ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da not found: ID does not exist" 
containerID="ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.828600 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da"} err="failed to get container status \"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\": rpc error: code = NotFound desc = could not find container \"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\": container with ID starting with ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.828615 4922 scope.go:117] "RemoveContainer" containerID="359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.828875 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\": container with ID starting with 359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096 not found: ID does not exist" containerID="359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.828964 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096"} err="failed to get container status \"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\": rpc error: code = NotFound desc = could not find container \"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\": container with ID starting with 359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.829053 4922 scope.go:117] 
"RemoveContainer" containerID="1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.829320 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\": container with ID starting with 1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941 not found: ID does not exist" containerID="1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.829357 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941"} err="failed to get container status \"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\": rpc error: code = NotFound desc = could not find container \"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\": container with ID starting with 1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.829371 4922 scope.go:117] "RemoveContainer" containerID="115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.829602 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\": container with ID starting with 115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16 not found: ID does not exist" containerID="115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.829697 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16"} err="failed to get container status \"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\": rpc error: code = NotFound desc = could not find container \"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\": container with ID starting with 115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.829771 4922 scope.go:117] "RemoveContainer" containerID="22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4" Sep 29 09:54:38 crc kubenswrapper[4922]: E0929 09:54:38.830100 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\": container with ID starting with 22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4 not found: ID does not exist" containerID="22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.830125 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4"} err="failed to get container status \"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\": rpc error: code = NotFound desc = could not find container \"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\": container with ID starting with 22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.830138 4922 scope.go:117] "RemoveContainer" containerID="e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.830334 4922 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff"} err="failed to get container status \"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff\": rpc error: code = NotFound desc = could not find container \"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff\": container with ID starting with e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.830422 4922 scope.go:117] "RemoveContainer" containerID="f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.830921 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661"} err="failed to get container status \"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\": rpc error: code = NotFound desc = could not find container \"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\": container with ID starting with f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.831009 4922 scope.go:117] "RemoveContainer" containerID="2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.831337 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274"} err="failed to get container status \"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\": rpc error: code = NotFound desc = could not find container \"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\": container with ID starting with 2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274 not 
found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.831453 4922 scope.go:117] "RemoveContainer" containerID="7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.831805 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4"} err="failed to get container status \"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\": rpc error: code = NotFound desc = could not find container \"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\": container with ID starting with 7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.831860 4922 scope.go:117] "RemoveContainer" containerID="ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.832234 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da"} err="failed to get container status \"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\": rpc error: code = NotFound desc = could not find container \"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\": container with ID starting with ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.832256 4922 scope.go:117] "RemoveContainer" containerID="359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.832573 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096"} err="failed to get 
container status \"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\": rpc error: code = NotFound desc = could not find container \"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\": container with ID starting with 359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.832600 4922 scope.go:117] "RemoveContainer" containerID="1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.832882 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941"} err="failed to get container status \"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\": rpc error: code = NotFound desc = could not find container \"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\": container with ID starting with 1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.832979 4922 scope.go:117] "RemoveContainer" containerID="115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.833384 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16"} err="failed to get container status \"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\": rpc error: code = NotFound desc = could not find container \"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\": container with ID starting with 115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.833467 4922 scope.go:117] "RemoveContainer" 
containerID="22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.833773 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4"} err="failed to get container status \"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\": rpc error: code = NotFound desc = could not find container \"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\": container with ID starting with 22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.833800 4922 scope.go:117] "RemoveContainer" containerID="e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.834099 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff"} err="failed to get container status \"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff\": rpc error: code = NotFound desc = could not find container \"e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff\": container with ID starting with e9f106a1b90992b5bcc93870a3833b09c52542140e5c4b480eace7c59abea1ff not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.834126 4922 scope.go:117] "RemoveContainer" containerID="f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.834387 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661"} err="failed to get container status \"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\": rpc error: code = NotFound desc = could 
not find container \"f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661\": container with ID starting with f179a9deecb0c7d464f1d3d510d480f59f9c0eb9e931de44aabfc66b9c393661 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.834414 4922 scope.go:117] "RemoveContainer" containerID="2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.834610 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274"} err="failed to get container status \"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\": rpc error: code = NotFound desc = could not find container \"2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274\": container with ID starting with 2df1f2abd898b9ce4a0a6c3679f0513ad8753aa88cb22a4e314a6986583c4274 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.834636 4922 scope.go:117] "RemoveContainer" containerID="7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.834914 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4"} err="failed to get container status \"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\": rpc error: code = NotFound desc = could not find container \"7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4\": container with ID starting with 7eb9d8825eeeba184807be79dea26d7eae3b32dbb4c8f53aaefb55bc8b623ee4 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.834938 4922 scope.go:117] "RemoveContainer" containerID="ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 
09:54:38.835296 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da"} err="failed to get container status \"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\": rpc error: code = NotFound desc = could not find container \"ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da\": container with ID starting with ab4b8a9010f89f46eaf933f5b9c0d76f5f6d78e20810e665494d9daf35ab73da not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.835384 4922 scope.go:117] "RemoveContainer" containerID="359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.835706 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096"} err="failed to get container status \"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\": rpc error: code = NotFound desc = could not find container \"359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096\": container with ID starting with 359e5a8a207620c82024285891dec428cada5d3eebe3a553bd7131eeb3f0b096 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.835736 4922 scope.go:117] "RemoveContainer" containerID="1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.836024 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941"} err="failed to get container status \"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\": rpc error: code = NotFound desc = could not find container \"1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941\": container with ID starting with 
1ae3931db423f4a46319f85bdd389b524cfbc22e12e50befdabe8dd44e8ae941 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.836046 4922 scope.go:117] "RemoveContainer" containerID="115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.836533 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16"} err="failed to get container status \"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\": rpc error: code = NotFound desc = could not find container \"115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16\": container with ID starting with 115ffb2e25db2e4231a323fad641eed3f67fa9ee94040bff028c2f8ca3efad16 not found: ID does not exist" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.836570 4922 scope.go:117] "RemoveContainer" containerID="22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4" Sep 29 09:54:38 crc kubenswrapper[4922]: I0929 09:54:38.836845 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4"} err="failed to get container status \"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\": rpc error: code = NotFound desc = could not find container \"22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4\": container with ID starting with 22965724b3b41f11c7cfbad0c4937ac53550552d1ef67e68a9f0bee5c90e29c4 not found: ID does not exist" Sep 29 09:54:39 crc kubenswrapper[4922]: I0929 09:54:39.469080 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee08d9f2-f100-4598-8ab3-5198a21b08f0" path="/var/lib/kubelet/pods/ee08d9f2-f100-4598-8ab3-5198a21b08f0/volumes" Sep 29 09:54:39 crc kubenswrapper[4922]: I0929 09:54:39.613586 4922 generic.go:334] "Generic (PLEG): container 
finished" podID="556f8437-05c1-4061-9213-c868248f76fb" containerID="c24b37932b7cd7eb09e621784468706e180a126164a5eb89701ae859b627450a" exitCode=0 Sep 29 09:54:39 crc kubenswrapper[4922]: I0929 09:54:39.614046 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerDied","Data":"c24b37932b7cd7eb09e621784468706e180a126164a5eb89701ae859b627450a"} Sep 29 09:54:40 crc kubenswrapper[4922]: I0929 09:54:40.627013 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerStarted","Data":"7ad4fa66457fdd5e08013ffb0fe267199e2d394a58fdc950790c5b8d9f4e66cc"} Sep 29 09:54:40 crc kubenswrapper[4922]: I0929 09:54:40.627785 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerStarted","Data":"5e1816aa93fc3f80c9ef7c67f18474c4bf3f00cb6db0a9c3f429ff4237431d5e"} Sep 29 09:54:40 crc kubenswrapper[4922]: I0929 09:54:40.627799 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerStarted","Data":"88c988863114b345a8cb4c841ee8bd32d2fc8014328132a7003381238503fea3"} Sep 29 09:54:40 crc kubenswrapper[4922]: I0929 09:54:40.627808 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerStarted","Data":"8d8534445a82c94be30c302ee75b9e4bf28c878da46bb5f22f3d15ff20133315"} Sep 29 09:54:40 crc kubenswrapper[4922]: I0929 09:54:40.627841 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" 
event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerStarted","Data":"f670c7885cd14f84af451cc7d3bb5c4dca4fcd87192094e7e03458479338b968"} Sep 29 09:54:40 crc kubenswrapper[4922]: I0929 09:54:40.627855 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerStarted","Data":"1ac517ff20c4deffeb9ba738dbeea911504776f7019f2b7dba83e050aa848102"} Sep 29 09:54:43 crc kubenswrapper[4922]: I0929 09:54:43.645243 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerStarted","Data":"2b7d2461d505563b25c8ed66c69c072dc6df6b38387699da309c2aba7b6c1e2a"} Sep 29 09:54:45 crc kubenswrapper[4922]: I0929 09:54:45.658438 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" event={"ID":"556f8437-05c1-4061-9213-c868248f76fb","Type":"ContainerStarted","Data":"792cdea3dd3a1a846de922473c8869f66e5c059b9549ffd0531b6c052ce7ea26"} Sep 29 09:54:45 crc kubenswrapper[4922]: I0929 09:54:45.659057 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:45 crc kubenswrapper[4922]: I0929 09:54:45.659071 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:45 crc kubenswrapper[4922]: I0929 09:54:45.659082 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:45 crc kubenswrapper[4922]: I0929 09:54:45.687199 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:45 crc kubenswrapper[4922]: I0929 09:54:45.688624 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:54:45 crc kubenswrapper[4922]: I0929 09:54:45.695006 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" podStartSLOduration=7.69498679 podStartE2EDuration="7.69498679s" podCreationTimestamp="2025-09-29 09:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:54:45.691499857 +0000 UTC m=+611.057730131" watchObservedRunningTime="2025-09-29 09:54:45.69498679 +0000 UTC m=+611.061217054" Sep 29 09:54:49 crc kubenswrapper[4922]: I0929 09:54:49.452218 4922 scope.go:117] "RemoveContainer" containerID="86905bae7fc76fddcc8a482538e2e8667cfae303320d10f72f7a4053c8b9aefa" Sep 29 09:54:49 crc kubenswrapper[4922]: E0929 09:54:49.452904 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-h6dfk_openshift-multus(7dc69012-4e4c-437b-82d8-9d04e2e22e58)\"" pod="openshift-multus/multus-h6dfk" podUID="7dc69012-4e4c-437b-82d8-9d04e2e22e58" Sep 29 09:55:01 crc kubenswrapper[4922]: I0929 09:55:01.452479 4922 scope.go:117] "RemoveContainer" containerID="86905bae7fc76fddcc8a482538e2e8667cfae303320d10f72f7a4053c8b9aefa" Sep 29 09:55:01 crc kubenswrapper[4922]: I0929 09:55:01.758553 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6dfk_7dc69012-4e4c-437b-82d8-9d04e2e22e58/kube-multus/2.log" Sep 29 09:55:01 crc kubenswrapper[4922]: I0929 09:55:01.759210 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6dfk" event={"ID":"7dc69012-4e4c-437b-82d8-9d04e2e22e58","Type":"ContainerStarted","Data":"ed1e76fda379b81cbff385d3066b75ba7f944da1222dc542960d74ae03381563"} Sep 29 09:55:08 crc kubenswrapper[4922]: I0929 09:55:08.499388 4922 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-svd6q" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.729953 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm"] Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.732064 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.733987 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.744571 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm"] Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.865476 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9gcm\" (UniqueName: \"kubernetes.io/projected/814fb3a2-10f7-4136-8745-0caf3cc5dac8-kube-api-access-v9gcm\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.865523 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.865552 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.966805 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9gcm\" (UniqueName: \"kubernetes.io/projected/814fb3a2-10f7-4136-8745-0caf3cc5dac8-kube-api-access-v9gcm\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.966888 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.966912 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.967529 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.967578 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:15 crc kubenswrapper[4922]: I0929 09:55:15.990020 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9gcm\" (UniqueName: \"kubernetes.io/projected/814fb3a2-10f7-4136-8745-0caf3cc5dac8-kube-api-access-v9gcm\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:16 crc kubenswrapper[4922]: I0929 09:55:16.055748 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:16 crc kubenswrapper[4922]: I0929 09:55:16.235112 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm"] Sep 29 09:55:16 crc kubenswrapper[4922]: I0929 09:55:16.839710 4922 generic.go:334] "Generic (PLEG): container finished" podID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerID="33efb60b014fdda4b98e96e4efe30320a636ee21cdea195758698f88d686e96f" exitCode=0 Sep 29 09:55:16 crc kubenswrapper[4922]: I0929 09:55:16.839795 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" event={"ID":"814fb3a2-10f7-4136-8745-0caf3cc5dac8","Type":"ContainerDied","Data":"33efb60b014fdda4b98e96e4efe30320a636ee21cdea195758698f88d686e96f"} Sep 29 09:55:16 crc kubenswrapper[4922]: I0929 09:55:16.841003 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" event={"ID":"814fb3a2-10f7-4136-8745-0caf3cc5dac8","Type":"ContainerStarted","Data":"9011a25c93d0d7dd6ce13d1b8655f6dbc076b5d2a5e9280fac3080f245a6c481"} Sep 29 09:55:18 crc kubenswrapper[4922]: I0929 09:55:18.856603 4922 generic.go:334] "Generic (PLEG): container finished" podID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerID="e328d4c14b16c33a5b41b677b17cc77f2b20206b08c3aa3171b40f08a5a13608" exitCode=0 Sep 29 09:55:18 crc kubenswrapper[4922]: I0929 09:55:18.856752 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" event={"ID":"814fb3a2-10f7-4136-8745-0caf3cc5dac8","Type":"ContainerDied","Data":"e328d4c14b16c33a5b41b677b17cc77f2b20206b08c3aa3171b40f08a5a13608"} Sep 29 09:55:19 crc kubenswrapper[4922]: I0929 09:55:19.864559 4922 
generic.go:334] "Generic (PLEG): container finished" podID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerID="4fe44b07718f3f49552d66422930ffab0f8ba0e8b00bae709a14d9c1b87c0514" exitCode=0 Sep 29 09:55:19 crc kubenswrapper[4922]: I0929 09:55:19.864648 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" event={"ID":"814fb3a2-10f7-4136-8745-0caf3cc5dac8","Type":"ContainerDied","Data":"4fe44b07718f3f49552d66422930ffab0f8ba0e8b00bae709a14d9c1b87c0514"} Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.092250 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.141166 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-bundle\") pod \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.141214 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-util\") pod \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.141331 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9gcm\" (UniqueName: \"kubernetes.io/projected/814fb3a2-10f7-4136-8745-0caf3cc5dac8-kube-api-access-v9gcm\") pod \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\" (UID: \"814fb3a2-10f7-4136-8745-0caf3cc5dac8\") " Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.141957 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-bundle" (OuterVolumeSpecName: "bundle") pod "814fb3a2-10f7-4136-8745-0caf3cc5dac8" (UID: "814fb3a2-10f7-4136-8745-0caf3cc5dac8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.147193 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814fb3a2-10f7-4136-8745-0caf3cc5dac8-kube-api-access-v9gcm" (OuterVolumeSpecName: "kube-api-access-v9gcm") pod "814fb3a2-10f7-4136-8745-0caf3cc5dac8" (UID: "814fb3a2-10f7-4136-8745-0caf3cc5dac8"). InnerVolumeSpecName "kube-api-access-v9gcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.154406 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-util" (OuterVolumeSpecName: "util") pod "814fb3a2-10f7-4136-8745-0caf3cc5dac8" (UID: "814fb3a2-10f7-4136-8745-0caf3cc5dac8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.242546 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9gcm\" (UniqueName: \"kubernetes.io/projected/814fb3a2-10f7-4136-8745-0caf3cc5dac8-kube-api-access-v9gcm\") on node \"crc\" DevicePath \"\"" Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.242586 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.242599 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814fb3a2-10f7-4136-8745-0caf3cc5dac8-util\") on node \"crc\" DevicePath \"\"" Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.876916 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" event={"ID":"814fb3a2-10f7-4136-8745-0caf3cc5dac8","Type":"ContainerDied","Data":"9011a25c93d0d7dd6ce13d1b8655f6dbc076b5d2a5e9280fac3080f245a6c481"} Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.876964 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm" Sep 29 09:55:21 crc kubenswrapper[4922]: I0929 09:55:21.876969 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9011a25c93d0d7dd6ce13d1b8655f6dbc076b5d2a5e9280fac3080f245a6c481" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.685593 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq"] Sep 29 09:55:23 crc kubenswrapper[4922]: E0929 09:55:23.686309 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerName="extract" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.686323 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerName="extract" Sep 29 09:55:23 crc kubenswrapper[4922]: E0929 09:55:23.686335 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerName="util" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.686341 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerName="util" Sep 29 09:55:23 crc kubenswrapper[4922]: E0929 09:55:23.686355 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerName="pull" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.686378 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerName="pull" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.686503 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="814fb3a2-10f7-4136-8745-0caf3cc5dac8" containerName="extract" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.687036 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.689423 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-59zmg" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.690959 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.691960 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.706183 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq"] Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.774663 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcqch\" (UniqueName: \"kubernetes.io/projected/485426b6-cad6-4591-beaf-d8bb33f79ea1-kube-api-access-gcqch\") pod \"nmstate-operator-5d6f6cfd66-hrqwq\" (UID: \"485426b6-cad6-4591-beaf-d8bb33f79ea1\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.877018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcqch\" (UniqueName: \"kubernetes.io/projected/485426b6-cad6-4591-beaf-d8bb33f79ea1-kube-api-access-gcqch\") pod \"nmstate-operator-5d6f6cfd66-hrqwq\" (UID: \"485426b6-cad6-4591-beaf-d8bb33f79ea1\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq" Sep 29 09:55:23 crc kubenswrapper[4922]: I0929 09:55:23.902738 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcqch\" (UniqueName: \"kubernetes.io/projected/485426b6-cad6-4591-beaf-d8bb33f79ea1-kube-api-access-gcqch\") pod \"nmstate-operator-5d6f6cfd66-hrqwq\" (UID: 
\"485426b6-cad6-4591-beaf-d8bb33f79ea1\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq" Sep 29 09:55:24 crc kubenswrapper[4922]: I0929 09:55:24.005715 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq" Sep 29 09:55:24 crc kubenswrapper[4922]: I0929 09:55:24.247423 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq"] Sep 29 09:55:24 crc kubenswrapper[4922]: I0929 09:55:24.900168 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq" event={"ID":"485426b6-cad6-4591-beaf-d8bb33f79ea1","Type":"ContainerStarted","Data":"45b23b16dbba26d9b6f0df5d5cf05bc0282ba255f47d8fff3eeb624bef5d3941"} Sep 29 09:55:26 crc kubenswrapper[4922]: I0929 09:55:26.931128 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq" event={"ID":"485426b6-cad6-4591-beaf-d8bb33f79ea1","Type":"ContainerStarted","Data":"c1314f3652afdecc8809a57df340f54f95d0d625d00416bdc43d7d087a819d1b"} Sep 29 09:55:26 crc kubenswrapper[4922]: I0929 09:55:26.948219 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-hrqwq" podStartSLOduration=1.512412324 podStartE2EDuration="3.948196135s" podCreationTimestamp="2025-09-29 09:55:23 +0000 UTC" firstStartedPulling="2025-09-29 09:55:24.26738728 +0000 UTC m=+649.633617544" lastFinishedPulling="2025-09-29 09:55:26.703171091 +0000 UTC m=+652.069401355" observedRunningTime="2025-09-29 09:55:26.945636437 +0000 UTC m=+652.311866711" watchObservedRunningTime="2025-09-29 09:55:26.948196135 +0000 UTC m=+652.314426399" Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 09:55:27.934952 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj"] Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 
09:55:27.936353 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj" Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 09:55:27.941549 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wq75l" Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 09:55:27.943301 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-99cmf"] Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 09:55:27.945610 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 09:55:27.949193 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 09:55:27.954601 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj"] Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 09:55:27.969452 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-99cmf"] Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 09:55:27.973550 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-sp9q9"] Sep 29 09:55:27 crc kubenswrapper[4922]: I0929 09:55:27.974781 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.042544 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnzjp\" (UniqueName: \"kubernetes.io/projected/bf8e5bd0-e08e-4818-843b-30f7c956626f-kube-api-access-rnzjp\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.042634 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/bf8e5bd0-e08e-4818-843b-30f7c956626f-ovs-socket\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.042660 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/bf8e5bd0-e08e-4818-843b-30f7c956626f-nmstate-lock\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.042680 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/bf8e5bd0-e08e-4818-843b-30f7c956626f-dbus-socket\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.042744 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/430f697e-6b89-4db1-91a8-194c8a7af724-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-99cmf\" (UID: 
\"430f697e-6b89-4db1-91a8-194c8a7af724\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.042763 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gchc\" (UniqueName: \"kubernetes.io/projected/430f697e-6b89-4db1-91a8-194c8a7af724-kube-api-access-6gchc\") pod \"nmstate-webhook-6d689559c5-99cmf\" (UID: \"430f697e-6b89-4db1-91a8-194c8a7af724\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.042783 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrt9z\" (UniqueName: \"kubernetes.io/projected/8fbc50c8-5afc-4ad5-888b-167e84fa22d0-kube-api-access-nrt9z\") pod \"nmstate-metrics-58fcddf996-5cfzj\" (UID: \"8fbc50c8-5afc-4ad5-888b-167e84fa22d0\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.068918 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq"] Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.069903 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.072240 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.073042 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.073329 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-m79f8" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.078619 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq"] Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144178 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144234 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxbzd\" (UniqueName: \"kubernetes.io/projected/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-kube-api-access-gxbzd\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144262 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/430f697e-6b89-4db1-91a8-194c8a7af724-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-99cmf\" (UID: 
\"430f697e-6b89-4db1-91a8-194c8a7af724\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144284 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gchc\" (UniqueName: \"kubernetes.io/projected/430f697e-6b89-4db1-91a8-194c8a7af724-kube-api-access-6gchc\") pod \"nmstate-webhook-6d689559c5-99cmf\" (UID: \"430f697e-6b89-4db1-91a8-194c8a7af724\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144306 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrt9z\" (UniqueName: \"kubernetes.io/projected/8fbc50c8-5afc-4ad5-888b-167e84fa22d0-kube-api-access-nrt9z\") pod \"nmstate-metrics-58fcddf996-5cfzj\" (UID: \"8fbc50c8-5afc-4ad5-888b-167e84fa22d0\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144338 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnzjp\" (UniqueName: \"kubernetes.io/projected/bf8e5bd0-e08e-4818-843b-30f7c956626f-kube-api-access-rnzjp\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144362 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/bf8e5bd0-e08e-4818-843b-30f7c956626f-ovs-socket\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144384 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/bf8e5bd0-e08e-4818-843b-30f7c956626f-nmstate-lock\") pod \"nmstate-handler-sp9q9\" (UID: 
\"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144410 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/bf8e5bd0-e08e-4818-843b-30f7c956626f-dbus-socket\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144448 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144550 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/bf8e5bd0-e08e-4818-843b-30f7c956626f-ovs-socket\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144580 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/bf8e5bd0-e08e-4818-843b-30f7c956626f-nmstate-lock\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: E0929 09:55:28.144409 4922 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 29 09:55:28 crc kubenswrapper[4922]: E0929 09:55:28.145019 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/430f697e-6b89-4db1-91a8-194c8a7af724-tls-key-pair podName:430f697e-6b89-4db1-91a8-194c8a7af724 nodeName:}" failed. No retries permitted until 2025-09-29 09:55:28.644998555 +0000 UTC m=+654.011228879 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/430f697e-6b89-4db1-91a8-194c8a7af724-tls-key-pair") pod "nmstate-webhook-6d689559c5-99cmf" (UID: "430f697e-6b89-4db1-91a8-194c8a7af724") : secret "openshift-nmstate-webhook" not found Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.144961 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/bf8e5bd0-e08e-4818-843b-30f7c956626f-dbus-socket\") pod \"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.166764 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrt9z\" (UniqueName: \"kubernetes.io/projected/8fbc50c8-5afc-4ad5-888b-167e84fa22d0-kube-api-access-nrt9z\") pod \"nmstate-metrics-58fcddf996-5cfzj\" (UID: \"8fbc50c8-5afc-4ad5-888b-167e84fa22d0\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.169227 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gchc\" (UniqueName: \"kubernetes.io/projected/430f697e-6b89-4db1-91a8-194c8a7af724-kube-api-access-6gchc\") pod \"nmstate-webhook-6d689559c5-99cmf\" (UID: \"430f697e-6b89-4db1-91a8-194c8a7af724\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.169332 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnzjp\" (UniqueName: \"kubernetes.io/projected/bf8e5bd0-e08e-4818-843b-30f7c956626f-kube-api-access-rnzjp\") pod 
\"nmstate-handler-sp9q9\" (UID: \"bf8e5bd0-e08e-4818-843b-30f7c956626f\") " pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.245522 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxbzd\" (UniqueName: \"kubernetes.io/projected/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-kube-api-access-gxbzd\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.245646 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.245690 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: E0929 09:55:28.245842 4922 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Sep 29 09:55:28 crc kubenswrapper[4922]: E0929 09:55:28.245902 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-plugin-serving-cert podName:bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72 nodeName:}" failed. No retries permitted until 2025-09-29 09:55:28.745883388 +0000 UTC m=+654.112113652 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-gc2pq" (UID: "bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72") : secret "plugin-serving-cert" not found Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.247774 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.259520 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.293803 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.294994 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-69487d6487-t22lp"] Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.295730 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.310082 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxbzd\" (UniqueName: \"kubernetes.io/projected/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-kube-api-access-gxbzd\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.329521 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69487d6487-t22lp"] Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.348593 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-console-oauth-config\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.372019 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwrhs\" (UniqueName: \"kubernetes.io/projected/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-kube-api-access-wwrhs\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.372090 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-oauth-serving-cert\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.372147 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-console-config\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.372182 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-console-serving-cert\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.372214 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-service-ca\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.372248 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-trusted-ca-bundle\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: W0929 09:55:28.400339 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf8e5bd0_e08e_4818_843b_30f7c956626f.slice/crio-d7a435e34e531fe40bea0d74ee49db86c281f2bbb91a6590bf2f3e4c5096cda7 WatchSource:0}: Error finding container d7a435e34e531fe40bea0d74ee49db86c281f2bbb91a6590bf2f3e4c5096cda7: Status 404 
returned error can't find the container with id d7a435e34e531fe40bea0d74ee49db86c281f2bbb91a6590bf2f3e4c5096cda7 Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.473608 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwrhs\" (UniqueName: \"kubernetes.io/projected/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-kube-api-access-wwrhs\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.473658 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-oauth-serving-cert\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.473681 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-console-config\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.473701 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-console-serving-cert\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.473721 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-service-ca\") pod \"console-69487d6487-t22lp\" (UID: 
\"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.473745 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-trusted-ca-bundle\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.473805 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-console-oauth-config\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.474676 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-console-config\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.475279 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-oauth-serving-cert\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.475282 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-service-ca\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " 
pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.476015 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-trusted-ca-bundle\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.477557 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-console-serving-cert\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.478794 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-console-oauth-config\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.514908 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwrhs\" (UniqueName: \"kubernetes.io/projected/4caa04cb-92c3-4782-8e3e-8a5ec2a762b3-kube-api-access-wwrhs\") pod \"console-69487d6487-t22lp\" (UID: \"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3\") " pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.604277 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj"] Sep 29 09:55:28 crc kubenswrapper[4922]: W0929 09:55:28.611422 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fbc50c8_5afc_4ad5_888b_167e84fa22d0.slice/crio-8cfb1093a0684ecf47c24808be2010547ef6f740e71f41a6cb4994fd92b14485 WatchSource:0}: Error finding container 8cfb1093a0684ecf47c24808be2010547ef6f740e71f41a6cb4994fd92b14485: Status 404 returned error can't find the container with id 8cfb1093a0684ecf47c24808be2010547ef6f740e71f41a6cb4994fd92b14485 Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.675927 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/430f697e-6b89-4db1-91a8-194c8a7af724-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-99cmf\" (UID: \"430f697e-6b89-4db1-91a8-194c8a7af724\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.680750 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/430f697e-6b89-4db1-91a8-194c8a7af724-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-99cmf\" (UID: \"430f697e-6b89-4db1-91a8-194c8a7af724\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.685086 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.777442 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.782702 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-gc2pq\" (UID: \"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.870367 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.984769 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.985183 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sp9q9" event={"ID":"bf8e5bd0-e08e-4818-843b-30f7c956626f","Type":"ContainerStarted","Data":"d7a435e34e531fe40bea0d74ee49db86c281f2bbb91a6590bf2f3e4c5096cda7"} Sep 29 09:55:28 crc kubenswrapper[4922]: I0929 09:55:28.986490 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj" event={"ID":"8fbc50c8-5afc-4ad5-888b-167e84fa22d0","Type":"ContainerStarted","Data":"8cfb1093a0684ecf47c24808be2010547ef6f740e71f41a6cb4994fd92b14485"} Sep 29 09:55:29 crc kubenswrapper[4922]: I0929 09:55:29.095042 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-99cmf"] Sep 29 09:55:29 crc kubenswrapper[4922]: W0929 09:55:29.097582 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod430f697e_6b89_4db1_91a8_194c8a7af724.slice/crio-a444d87e31ffe0b8a3910f735eddc2b20cc0fb4f6597565aed76169fc509d19f WatchSource:0}: Error finding container a444d87e31ffe0b8a3910f735eddc2b20cc0fb4f6597565aed76169fc509d19f: Status 404 returned error can't find the container with id a444d87e31ffe0b8a3910f735eddc2b20cc0fb4f6597565aed76169fc509d19f Sep 29 09:55:29 crc kubenswrapper[4922]: I0929 09:55:29.127541 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69487d6487-t22lp"] Sep 29 09:55:29 crc kubenswrapper[4922]: W0929 09:55:29.154950 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4caa04cb_92c3_4782_8e3e_8a5ec2a762b3.slice/crio-afdc1bb6963003ae58923fa31e330791348162927ada40534956eae5beaa19aa WatchSource:0}: Error finding container 
afdc1bb6963003ae58923fa31e330791348162927ada40534956eae5beaa19aa: Status 404 returned error can't find the container with id afdc1bb6963003ae58923fa31e330791348162927ada40534956eae5beaa19aa Sep 29 09:55:29 crc kubenswrapper[4922]: I0929 09:55:29.201024 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq"] Sep 29 09:55:29 crc kubenswrapper[4922]: W0929 09:55:29.209744 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb21dd2_5cc5_49e5_a2df_e47c1a29dc72.slice/crio-8efde14635ff45c6d5fb64bccb143cc2665b07c407206d15fae3bfadf5c415c6 WatchSource:0}: Error finding container 8efde14635ff45c6d5fb64bccb143cc2665b07c407206d15fae3bfadf5c415c6: Status 404 returned error can't find the container with id 8efde14635ff45c6d5fb64bccb143cc2665b07c407206d15fae3bfadf5c415c6 Sep 29 09:55:29 crc kubenswrapper[4922]: I0929 09:55:29.997764 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" event={"ID":"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72","Type":"ContainerStarted","Data":"8efde14635ff45c6d5fb64bccb143cc2665b07c407206d15fae3bfadf5c415c6"} Sep 29 09:55:30 crc kubenswrapper[4922]: I0929 09:55:30.001484 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" event={"ID":"430f697e-6b89-4db1-91a8-194c8a7af724","Type":"ContainerStarted","Data":"a444d87e31ffe0b8a3910f735eddc2b20cc0fb4f6597565aed76169fc509d19f"} Sep 29 09:55:30 crc kubenswrapper[4922]: I0929 09:55:30.003852 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69487d6487-t22lp" event={"ID":"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3","Type":"ContainerStarted","Data":"8e213a0f1334cca6ecf004edfb87d414353f3d5cc4de63e69a9cf4b0cef7cc23"} Sep 29 09:55:30 crc kubenswrapper[4922]: I0929 09:55:30.003895 4922 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-69487d6487-t22lp" event={"ID":"4caa04cb-92c3-4782-8e3e-8a5ec2a762b3","Type":"ContainerStarted","Data":"afdc1bb6963003ae58923fa31e330791348162927ada40534956eae5beaa19aa"} Sep 29 09:55:33 crc kubenswrapper[4922]: I0929 09:55:33.036852 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" event={"ID":"430f697e-6b89-4db1-91a8-194c8a7af724","Type":"ContainerStarted","Data":"c844e48c62cee180cea58d841acf64eeecbf845d2a88724a296e34027d54a1bf"} Sep 29 09:55:33 crc kubenswrapper[4922]: I0929 09:55:33.038270 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj" event={"ID":"8fbc50c8-5afc-4ad5-888b-167e84fa22d0","Type":"ContainerStarted","Data":"4bd9e8ea20bf1a6f29c0a38bce9b18de6a85e6417db4f2baf15aea75d03d0f46"} Sep 29 09:55:33 crc kubenswrapper[4922]: I0929 09:55:33.039664 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sp9q9" event={"ID":"bf8e5bd0-e08e-4818-843b-30f7c956626f","Type":"ContainerStarted","Data":"d8c8ed1b27ac6beef5106a2734008658d77e64f68035d4b9afa8c7697db457b3"} Sep 29 09:55:33 crc kubenswrapper[4922]: I0929 09:55:33.040609 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:33 crc kubenswrapper[4922]: I0929 09:55:33.060372 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69487d6487-t22lp" podStartSLOduration=5.06035145 podStartE2EDuration="5.06035145s" podCreationTimestamp="2025-09-29 09:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:55:30.027682132 +0000 UTC m=+655.393912406" watchObservedRunningTime="2025-09-29 09:55:33.06035145 +0000 UTC m=+658.426581714" Sep 29 09:55:33 crc kubenswrapper[4922]: I0929 09:55:33.063065 
4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-sp9q9" podStartSLOduration=2.346510801 podStartE2EDuration="6.063056671s" podCreationTimestamp="2025-09-29 09:55:27 +0000 UTC" firstStartedPulling="2025-09-29 09:55:28.410474136 +0000 UTC m=+653.776704400" lastFinishedPulling="2025-09-29 09:55:32.127020006 +0000 UTC m=+657.493250270" observedRunningTime="2025-09-29 09:55:33.058058199 +0000 UTC m=+658.424288463" watchObservedRunningTime="2025-09-29 09:55:33.063056671 +0000 UTC m=+658.429286935" Sep 29 09:55:34 crc kubenswrapper[4922]: I0929 09:55:34.060428 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" event={"ID":"bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72","Type":"ContainerStarted","Data":"1e540c733537dc99cfb0f34f777c5143687f2a19857618c9f02845d3b66bbe3e"} Sep 29 09:55:34 crc kubenswrapper[4922]: I0929 09:55:34.060999 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:34 crc kubenswrapper[4922]: I0929 09:55:34.085411 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-gc2pq" podStartSLOduration=2.288058319 podStartE2EDuration="6.085391214s" podCreationTimestamp="2025-09-29 09:55:28 +0000 UTC" firstStartedPulling="2025-09-29 09:55:29.213195884 +0000 UTC m=+654.579426148" lastFinishedPulling="2025-09-29 09:55:33.010528779 +0000 UTC m=+658.376759043" observedRunningTime="2025-09-29 09:55:34.082099738 +0000 UTC m=+659.448330002" watchObservedRunningTime="2025-09-29 09:55:34.085391214 +0000 UTC m=+659.451621478" Sep 29 09:55:34 crc kubenswrapper[4922]: I0929 09:55:34.103195 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" podStartSLOduration=3.349356532 podStartE2EDuration="7.103172632s" 
podCreationTimestamp="2025-09-29 09:55:27 +0000 UTC" firstStartedPulling="2025-09-29 09:55:29.10691945 +0000 UTC m=+654.473149724" lastFinishedPulling="2025-09-29 09:55:32.86073556 +0000 UTC m=+658.226965824" observedRunningTime="2025-09-29 09:55:34.098943921 +0000 UTC m=+659.465174185" watchObservedRunningTime="2025-09-29 09:55:34.103172632 +0000 UTC m=+659.469402896" Sep 29 09:55:35 crc kubenswrapper[4922]: I0929 09:55:35.066995 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj" event={"ID":"8fbc50c8-5afc-4ad5-888b-167e84fa22d0","Type":"ContainerStarted","Data":"ba7293ddc8a341e40db55c59eaa303e37d90834ba7abe8a0af115307b457e9e0"} Sep 29 09:55:35 crc kubenswrapper[4922]: I0929 09:55:35.089447 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5cfzj" podStartSLOduration=2.158990238 podStartE2EDuration="8.089420466s" podCreationTimestamp="2025-09-29 09:55:27 +0000 UTC" firstStartedPulling="2025-09-29 09:55:28.614147091 +0000 UTC m=+653.980377355" lastFinishedPulling="2025-09-29 09:55:34.544577319 +0000 UTC m=+659.910807583" observedRunningTime="2025-09-29 09:55:35.084348142 +0000 UTC m=+660.450578416" watchObservedRunningTime="2025-09-29 09:55:35.089420466 +0000 UTC m=+660.455650730" Sep 29 09:55:38 crc kubenswrapper[4922]: I0929 09:55:38.320762 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-sp9q9" Sep 29 09:55:38 crc kubenswrapper[4922]: I0929 09:55:38.685743 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:38 crc kubenswrapper[4922]: I0929 09:55:38.685813 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:38 crc kubenswrapper[4922]: I0929 09:55:38.690716 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:39 crc kubenswrapper[4922]: I0929 09:55:39.107074 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69487d6487-t22lp" Sep 29 09:55:39 crc kubenswrapper[4922]: I0929 09:55:39.162735 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4zgtm"] Sep 29 09:55:48 crc kubenswrapper[4922]: I0929 09:55:48.880070 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-99cmf" Sep 29 09:55:59 crc kubenswrapper[4922]: I0929 09:55:59.070715 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:55:59 crc kubenswrapper[4922]: I0929 09:55:59.071982 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:56:02 crc kubenswrapper[4922]: I0929 09:56:02.786219 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9"] Sep 29 09:56:02 crc kubenswrapper[4922]: I0929 09:56:02.790919 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:02 crc kubenswrapper[4922]: I0929 09:56:02.793661 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 29 09:56:02 crc kubenswrapper[4922]: I0929 09:56:02.802638 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9"] Sep 29 09:56:02 crc kubenswrapper[4922]: I0929 09:56:02.898163 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:02 crc kubenswrapper[4922]: I0929 09:56:02.898232 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:02 crc kubenswrapper[4922]: I0929 09:56:02.898280 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz2v5\" (UniqueName: \"kubernetes.io/projected/abce2942-2d7c-4097-992b-3ca6aabdc6f1-kube-api-access-sz2v5\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:03 crc kubenswrapper[4922]: 
I0929 09:56:03.000096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:03 crc kubenswrapper[4922]: I0929 09:56:03.000190 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:03 crc kubenswrapper[4922]: I0929 09:56:03.000244 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz2v5\" (UniqueName: \"kubernetes.io/projected/abce2942-2d7c-4097-992b-3ca6aabdc6f1-kube-api-access-sz2v5\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:03 crc kubenswrapper[4922]: I0929 09:56:03.001262 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:03 crc kubenswrapper[4922]: I0929 09:56:03.001261 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:03 crc kubenswrapper[4922]: I0929 09:56:03.021762 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz2v5\" (UniqueName: \"kubernetes.io/projected/abce2942-2d7c-4097-992b-3ca6aabdc6f1-kube-api-access-sz2v5\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:03 crc kubenswrapper[4922]: I0929 09:56:03.113262 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:03 crc kubenswrapper[4922]: I0929 09:56:03.330478 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9"] Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.210586 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-4zgtm" podUID="48e2c6f9-1502-4fa6-854d-ef25455dadb1" containerName="console" containerID="cri-o://ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9" gracePeriod=15 Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.284508 4922 generic.go:334] "Generic (PLEG): container finished" podID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerID="97d956d0fe22efb14d45dd71550d522dd4de00ec5272019141f2061c0dffdb79" exitCode=0 Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.284571 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" event={"ID":"abce2942-2d7c-4097-992b-3ca6aabdc6f1","Type":"ContainerDied","Data":"97d956d0fe22efb14d45dd71550d522dd4de00ec5272019141f2061c0dffdb79"} Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.284608 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" event={"ID":"abce2942-2d7c-4097-992b-3ca6aabdc6f1","Type":"ContainerStarted","Data":"003b842cc505f88cc2f78cc3641c4b9f0beb17fdc5c79b51b8bb868642e881a8"} Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.599884 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4zgtm_48e2c6f9-1502-4fa6-854d-ef25455dadb1/console/0.log" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.600515 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.728806 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-trusted-ca-bundle\") pod \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.728950 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-oauth-config\") pod \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.729048 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-service-ca\") pod 
\"48e2c6f9-1502-4fa6-854d-ef25455dadb1\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.729106 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-config\") pod \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.730010 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-serving-cert\") pod \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.730109 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sd8m\" (UniqueName: \"kubernetes.io/projected/48e2c6f9-1502-4fa6-854d-ef25455dadb1-kube-api-access-4sd8m\") pod \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.730168 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-oauth-serving-cert\") pod \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\" (UID: \"48e2c6f9-1502-4fa6-854d-ef25455dadb1\") " Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.730279 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-service-ca" (OuterVolumeSpecName: "service-ca") pod "48e2c6f9-1502-4fa6-854d-ef25455dadb1" (UID: "48e2c6f9-1502-4fa6-854d-ef25455dadb1"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.730324 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-config" (OuterVolumeSpecName: "console-config") pod "48e2c6f9-1502-4fa6-854d-ef25455dadb1" (UID: "48e2c6f9-1502-4fa6-854d-ef25455dadb1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.730745 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.730764 4922 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.730937 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "48e2c6f9-1502-4fa6-854d-ef25455dadb1" (UID: "48e2c6f9-1502-4fa6-854d-ef25455dadb1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.730994 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "48e2c6f9-1502-4fa6-854d-ef25455dadb1" (UID: "48e2c6f9-1502-4fa6-854d-ef25455dadb1"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.737982 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "48e2c6f9-1502-4fa6-854d-ef25455dadb1" (UID: "48e2c6f9-1502-4fa6-854d-ef25455dadb1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.738783 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "48e2c6f9-1502-4fa6-854d-ef25455dadb1" (UID: "48e2c6f9-1502-4fa6-854d-ef25455dadb1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.738942 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e2c6f9-1502-4fa6-854d-ef25455dadb1-kube-api-access-4sd8m" (OuterVolumeSpecName: "kube-api-access-4sd8m") pod "48e2c6f9-1502-4fa6-854d-ef25455dadb1" (UID: "48e2c6f9-1502-4fa6-854d-ef25455dadb1"). InnerVolumeSpecName "kube-api-access-4sd8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.832033 4922 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.832082 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sd8m\" (UniqueName: \"kubernetes.io/projected/48e2c6f9-1502-4fa6-854d-ef25455dadb1-kube-api-access-4sd8m\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.832094 4922 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.832105 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e2c6f9-1502-4fa6-854d-ef25455dadb1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:04 crc kubenswrapper[4922]: I0929 09:56:04.832115 4922 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48e2c6f9-1502-4fa6-854d-ef25455dadb1-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.293537 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4zgtm_48e2c6f9-1502-4fa6-854d-ef25455dadb1/console/0.log" Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.293597 4922 generic.go:334] "Generic (PLEG): container finished" podID="48e2c6f9-1502-4fa6-854d-ef25455dadb1" containerID="ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9" exitCode=2 Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.293631 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4zgtm" event={"ID":"48e2c6f9-1502-4fa6-854d-ef25455dadb1","Type":"ContainerDied","Data":"ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9"} Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.293657 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4zgtm" event={"ID":"48e2c6f9-1502-4fa6-854d-ef25455dadb1","Type":"ContainerDied","Data":"648344bb67c4c2d01c05bb54d1405ace830c6ac7d986664f00917e1afa007c4f"} Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.293659 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4zgtm" Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.293672 4922 scope.go:117] "RemoveContainer" containerID="ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9" Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.336463 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4zgtm"] Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.341238 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4zgtm"] Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.361341 4922 scope.go:117] "RemoveContainer" containerID="ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9" Sep 29 09:56:05 crc kubenswrapper[4922]: E0929 09:56:05.361995 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9\": container with ID starting with ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9 not found: ID does not exist" containerID="ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9" Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.362251 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9"} err="failed to get container status \"ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9\": rpc error: code = NotFound desc = could not find container \"ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9\": container with ID starting with ea3340e03afc96971d05fd3b3c37e05c1dfeeff63d6872e43454bbc6d0ed08f9 not found: ID does not exist" Sep 29 09:56:05 crc kubenswrapper[4922]: I0929 09:56:05.460403 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e2c6f9-1502-4fa6-854d-ef25455dadb1" path="/var/lib/kubelet/pods/48e2c6f9-1502-4fa6-854d-ef25455dadb1/volumes" Sep 29 09:56:06 crc kubenswrapper[4922]: I0929 09:56:06.310345 4922 generic.go:334] "Generic (PLEG): container finished" podID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerID="33ccd8a7a1ff775a2cdad85aa1c982b09ff71020ead3ecf68f84d43f17012e27" exitCode=0 Sep 29 09:56:06 crc kubenswrapper[4922]: I0929 09:56:06.310434 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" event={"ID":"abce2942-2d7c-4097-992b-3ca6aabdc6f1","Type":"ContainerDied","Data":"33ccd8a7a1ff775a2cdad85aa1c982b09ff71020ead3ecf68f84d43f17012e27"} Sep 29 09:56:07 crc kubenswrapper[4922]: I0929 09:56:07.321162 4922 generic.go:334] "Generic (PLEG): container finished" podID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerID="e907d2e68aad2e8fdd21bd621d3107ff671dd54ab3799b0015fb9d50c7fc2ed2" exitCode=0 Sep 29 09:56:07 crc kubenswrapper[4922]: I0929 09:56:07.321366 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" 
event={"ID":"abce2942-2d7c-4097-992b-3ca6aabdc6f1","Type":"ContainerDied","Data":"e907d2e68aad2e8fdd21bd621d3107ff671dd54ab3799b0015fb9d50c7fc2ed2"} Sep 29 09:56:08 crc kubenswrapper[4922]: I0929 09:56:08.593128 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:08 crc kubenswrapper[4922]: I0929 09:56:08.694016 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz2v5\" (UniqueName: \"kubernetes.io/projected/abce2942-2d7c-4097-992b-3ca6aabdc6f1-kube-api-access-sz2v5\") pod \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " Sep 29 09:56:08 crc kubenswrapper[4922]: I0929 09:56:08.694118 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-bundle\") pod \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " Sep 29 09:56:08 crc kubenswrapper[4922]: I0929 09:56:08.694149 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-util\") pod \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\" (UID: \"abce2942-2d7c-4097-992b-3ca6aabdc6f1\") " Sep 29 09:56:08 crc kubenswrapper[4922]: I0929 09:56:08.696067 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-bundle" (OuterVolumeSpecName: "bundle") pod "abce2942-2d7c-4097-992b-3ca6aabdc6f1" (UID: "abce2942-2d7c-4097-992b-3ca6aabdc6f1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:56:08 crc kubenswrapper[4922]: I0929 09:56:08.701983 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abce2942-2d7c-4097-992b-3ca6aabdc6f1-kube-api-access-sz2v5" (OuterVolumeSpecName: "kube-api-access-sz2v5") pod "abce2942-2d7c-4097-992b-3ca6aabdc6f1" (UID: "abce2942-2d7c-4097-992b-3ca6aabdc6f1"). InnerVolumeSpecName "kube-api-access-sz2v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:56:08 crc kubenswrapper[4922]: I0929 09:56:08.795652 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:08 crc kubenswrapper[4922]: I0929 09:56:08.795715 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz2v5\" (UniqueName: \"kubernetes.io/projected/abce2942-2d7c-4097-992b-3ca6aabdc6f1-kube-api-access-sz2v5\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:09 crc kubenswrapper[4922]: I0929 09:56:09.094595 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-util" (OuterVolumeSpecName: "util") pod "abce2942-2d7c-4097-992b-3ca6aabdc6f1" (UID: "abce2942-2d7c-4097-992b-3ca6aabdc6f1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:56:09 crc kubenswrapper[4922]: I0929 09:56:09.103340 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abce2942-2d7c-4097-992b-3ca6aabdc6f1-util\") on node \"crc\" DevicePath \"\"" Sep 29 09:56:09 crc kubenswrapper[4922]: I0929 09:56:09.339583 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" event={"ID":"abce2942-2d7c-4097-992b-3ca6aabdc6f1","Type":"ContainerDied","Data":"003b842cc505f88cc2f78cc3641c4b9f0beb17fdc5c79b51b8bb868642e881a8"} Sep 29 09:56:09 crc kubenswrapper[4922]: I0929 09:56:09.340097 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003b842cc505f88cc2f78cc3641c4b9f0beb17fdc5c79b51b8bb868642e881a8" Sep 29 09:56:09 crc kubenswrapper[4922]: I0929 09:56:09.339715 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.675450 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps"] Sep 29 09:56:18 crc kubenswrapper[4922]: E0929 09:56:18.677717 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e2c6f9-1502-4fa6-854d-ef25455dadb1" containerName="console" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.677809 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e2c6f9-1502-4fa6-854d-ef25455dadb1" containerName="console" Sep 29 09:56:18 crc kubenswrapper[4922]: E0929 09:56:18.677897 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerName="extract" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.677971 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerName="extract" Sep 29 09:56:18 crc kubenswrapper[4922]: E0929 09:56:18.678046 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerName="pull" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.678110 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerName="pull" Sep 29 09:56:18 crc kubenswrapper[4922]: E0929 09:56:18.678189 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerName="util" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.678258 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerName="util" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.678456 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e2c6f9-1502-4fa6-854d-ef25455dadb1" containerName="console" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.678536 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="abce2942-2d7c-4097-992b-3ca6aabdc6f1" containerName="extract" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.679335 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.687122 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.687622 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.687818 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.687991 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.688516 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zhjgx" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.703271 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps"] Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.753695 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/74df0c3b-e3ed-4061-9fb2-a9a830974755-apiservice-cert\") pod \"metallb-operator-controller-manager-6bfbf6b8fd-w44ps\" (UID: \"74df0c3b-e3ed-4061-9fb2-a9a830974755\") " pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.753747 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/74df0c3b-e3ed-4061-9fb2-a9a830974755-webhook-cert\") pod \"metallb-operator-controller-manager-6bfbf6b8fd-w44ps\" (UID: 
\"74df0c3b-e3ed-4061-9fb2-a9a830974755\") " pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.753803 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f5bk\" (UniqueName: \"kubernetes.io/projected/74df0c3b-e3ed-4061-9fb2-a9a830974755-kube-api-access-9f5bk\") pod \"metallb-operator-controller-manager-6bfbf6b8fd-w44ps\" (UID: \"74df0c3b-e3ed-4061-9fb2-a9a830974755\") " pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.855459 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/74df0c3b-e3ed-4061-9fb2-a9a830974755-apiservice-cert\") pod \"metallb-operator-controller-manager-6bfbf6b8fd-w44ps\" (UID: \"74df0c3b-e3ed-4061-9fb2-a9a830974755\") " pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.855514 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/74df0c3b-e3ed-4061-9fb2-a9a830974755-webhook-cert\") pod \"metallb-operator-controller-manager-6bfbf6b8fd-w44ps\" (UID: \"74df0c3b-e3ed-4061-9fb2-a9a830974755\") " pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.855570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f5bk\" (UniqueName: \"kubernetes.io/projected/74df0c3b-e3ed-4061-9fb2-a9a830974755-kube-api-access-9f5bk\") pod \"metallb-operator-controller-manager-6bfbf6b8fd-w44ps\" (UID: \"74df0c3b-e3ed-4061-9fb2-a9a830974755\") " pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.866366 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/74df0c3b-e3ed-4061-9fb2-a9a830974755-apiservice-cert\") pod \"metallb-operator-controller-manager-6bfbf6b8fd-w44ps\" (UID: \"74df0c3b-e3ed-4061-9fb2-a9a830974755\") " pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.872764 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/74df0c3b-e3ed-4061-9fb2-a9a830974755-webhook-cert\") pod \"metallb-operator-controller-manager-6bfbf6b8fd-w44ps\" (UID: \"74df0c3b-e3ed-4061-9fb2-a9a830974755\") " pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.884537 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f5bk\" (UniqueName: \"kubernetes.io/projected/74df0c3b-e3ed-4061-9fb2-a9a830974755-kube-api-access-9f5bk\") pod \"metallb-operator-controller-manager-6bfbf6b8fd-w44ps\" (UID: \"74df0c3b-e3ed-4061-9fb2-a9a830974755\") " pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.919963 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz"] Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.920656 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.923587 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.923869 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-k7m59" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.925266 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 29 09:56:18 crc kubenswrapper[4922]: I0929 09:56:18.944370 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz"] Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.003532 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.060669 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drvnw\" (UniqueName: \"kubernetes.io/projected/2d2222f4-496b-4cbf-883c-e3ac89e08a79-kube-api-access-drvnw\") pod \"metallb-operator-webhook-server-654f6f79d6-9gbmz\" (UID: \"2d2222f4-496b-4cbf-883c-e3ac89e08a79\") " pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.061518 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d2222f4-496b-4cbf-883c-e3ac89e08a79-webhook-cert\") pod \"metallb-operator-webhook-server-654f6f79d6-9gbmz\" (UID: \"2d2222f4-496b-4cbf-883c-e3ac89e08a79\") " pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 
09:56:19.061729 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d2222f4-496b-4cbf-883c-e3ac89e08a79-apiservice-cert\") pod \"metallb-operator-webhook-server-654f6f79d6-9gbmz\" (UID: \"2d2222f4-496b-4cbf-883c-e3ac89e08a79\") " pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.162964 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d2222f4-496b-4cbf-883c-e3ac89e08a79-webhook-cert\") pod \"metallb-operator-webhook-server-654f6f79d6-9gbmz\" (UID: \"2d2222f4-496b-4cbf-883c-e3ac89e08a79\") " pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.163056 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d2222f4-496b-4cbf-883c-e3ac89e08a79-apiservice-cert\") pod \"metallb-operator-webhook-server-654f6f79d6-9gbmz\" (UID: \"2d2222f4-496b-4cbf-883c-e3ac89e08a79\") " pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.163124 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drvnw\" (UniqueName: \"kubernetes.io/projected/2d2222f4-496b-4cbf-883c-e3ac89e08a79-kube-api-access-drvnw\") pod \"metallb-operator-webhook-server-654f6f79d6-9gbmz\" (UID: \"2d2222f4-496b-4cbf-883c-e3ac89e08a79\") " pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.174878 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d2222f4-496b-4cbf-883c-e3ac89e08a79-webhook-cert\") pod 
\"metallb-operator-webhook-server-654f6f79d6-9gbmz\" (UID: \"2d2222f4-496b-4cbf-883c-e3ac89e08a79\") " pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.175345 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d2222f4-496b-4cbf-883c-e3ac89e08a79-apiservice-cert\") pod \"metallb-operator-webhook-server-654f6f79d6-9gbmz\" (UID: \"2d2222f4-496b-4cbf-883c-e3ac89e08a79\") " pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.192917 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drvnw\" (UniqueName: \"kubernetes.io/projected/2d2222f4-496b-4cbf-883c-e3ac89e08a79-kube-api-access-drvnw\") pod \"metallb-operator-webhook-server-654f6f79d6-9gbmz\" (UID: \"2d2222f4-496b-4cbf-883c-e3ac89e08a79\") " pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.234737 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.242277 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps"] Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.409435 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" event={"ID":"74df0c3b-e3ed-4061-9fb2-a9a830974755","Type":"ContainerStarted","Data":"03a6a6790b55f11efe8d1282d35d3736c27963a5a49a13569862e328271d5f19"} Sep 29 09:56:19 crc kubenswrapper[4922]: I0929 09:56:19.474461 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz"] Sep 29 09:56:19 crc kubenswrapper[4922]: W0929 09:56:19.484796 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d2222f4_496b_4cbf_883c_e3ac89e08a79.slice/crio-61fa44b2202c92690b30f1a0badd333aaccb987268d4456379ce5fef5062e99e WatchSource:0}: Error finding container 61fa44b2202c92690b30f1a0badd333aaccb987268d4456379ce5fef5062e99e: Status 404 returned error can't find the container with id 61fa44b2202c92690b30f1a0badd333aaccb987268d4456379ce5fef5062e99e Sep 29 09:56:20 crc kubenswrapper[4922]: I0929 09:56:20.418766 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" event={"ID":"2d2222f4-496b-4cbf-883c-e3ac89e08a79","Type":"ContainerStarted","Data":"61fa44b2202c92690b30f1a0badd333aaccb987268d4456379ce5fef5062e99e"} Sep 29 09:56:25 crc kubenswrapper[4922]: I0929 09:56:25.471734 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" 
event={"ID":"2d2222f4-496b-4cbf-883c-e3ac89e08a79","Type":"ContainerStarted","Data":"f9a2eca91343340b58497008ab061357bd3bdda38a7828c23e2257d653754270"} Sep 29 09:56:25 crc kubenswrapper[4922]: I0929 09:56:25.472213 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:25 crc kubenswrapper[4922]: I0929 09:56:25.473235 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" event={"ID":"74df0c3b-e3ed-4061-9fb2-a9a830974755","Type":"ContainerStarted","Data":"705b79280298ba91f0b9fe1681f618378277b611f9ebd87fd6e2b5f10bc8d5ae"} Sep 29 09:56:25 crc kubenswrapper[4922]: I0929 09:56:25.473434 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:25 crc kubenswrapper[4922]: I0929 09:56:25.529874 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" podStartSLOduration=2.56628165 podStartE2EDuration="7.529826641s" podCreationTimestamp="2025-09-29 09:56:18 +0000 UTC" firstStartedPulling="2025-09-29 09:56:19.489554877 +0000 UTC m=+704.855785151" lastFinishedPulling="2025-09-29 09:56:24.453099878 +0000 UTC m=+709.819330142" observedRunningTime="2025-09-29 09:56:25.526568966 +0000 UTC m=+710.892799230" watchObservedRunningTime="2025-09-29 09:56:25.529826641 +0000 UTC m=+710.896056905" Sep 29 09:56:25 crc kubenswrapper[4922]: I0929 09:56:25.551693 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" podStartSLOduration=2.383514725 podStartE2EDuration="7.551669286s" podCreationTimestamp="2025-09-29 09:56:18 +0000 UTC" firstStartedPulling="2025-09-29 09:56:19.265959988 +0000 UTC m=+704.632190252" lastFinishedPulling="2025-09-29 
09:56:24.434114539 +0000 UTC m=+709.800344813" observedRunningTime="2025-09-29 09:56:25.547976929 +0000 UTC m=+710.914207233" watchObservedRunningTime="2025-09-29 09:56:25.551669286 +0000 UTC m=+710.917899550" Sep 29 09:56:29 crc kubenswrapper[4922]: I0929 09:56:29.070341 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:56:29 crc kubenswrapper[4922]: I0929 09:56:29.070867 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:56:39 crc kubenswrapper[4922]: I0929 09:56:39.241684 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-654f6f79d6-9gbmz" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.007882 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6bfbf6b8fd-w44ps" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.071493 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.071620 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.071714 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.072902 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"438f14a9f27df3e3e3379a1de404ccf8246b85d1a7a877658b63d5fd223866ed"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.073018 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://438f14a9f27df3e3e3379a1de404ccf8246b85d1a7a877658b63d5fd223866ed" gracePeriod=600 Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.717359 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="438f14a9f27df3e3e3379a1de404ccf8246b85d1a7a877658b63d5fd223866ed" exitCode=0 Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.717469 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"438f14a9f27df3e3e3379a1de404ccf8246b85d1a7a877658b63d5fd223866ed"} Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.717736 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"7884bf02a997a61f9124b5ac0faf1322742549dc99578bbb4ee5d6c1d6b88217"} Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.717766 4922 scope.go:117] "RemoveContainer" containerID="6ec354693b7252058b868e7450deb01318aa6f043106e5484a5d126aed53e14b" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.774130 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-54pq5"] Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.779562 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-54pq5" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.783377 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.783647 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.784046 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4mq9t" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.809921 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q"] Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.834683 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q"] Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.834883 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.844120 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.872503 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jp26n"] Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.873730 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jp26n" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.878477 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.878551 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.878768 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.878941 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hj59g" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.884801 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-vz5n2"] Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.887450 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.893095 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.903941 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-frr-sockets\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.903997 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/41460288-0fe6-4f0f-ba8d-121ee673bf0d-frr-startup\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.904021 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41460288-0fe6-4f0f-ba8d-121ee673bf0d-metrics-certs\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.904057 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-metrics\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.904088 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhcrg\" (UniqueName: 
\"kubernetes.io/projected/41460288-0fe6-4f0f-ba8d-121ee673bf0d-kube-api-access-jhcrg\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.904137 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-reloader\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.904159 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-frr-conf\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:56:59 crc kubenswrapper[4922]: I0929 09:56:59.907285 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-vz5n2"] Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005174 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-frr-conf\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005241 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7732c258-d416-45bd-92a4-1a852c9bf4e6-cert\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005271 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-p444v\" (UniqueName: \"kubernetes.io/projected/7732c258-d416-45bd-92a4-1a852c9bf4e6-kube-api-access-p444v\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005294 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b15f008-2077-4246-af46-d39384412fa5-metallb-excludel2\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005336 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe4bf7c5-f4cf-4c2c-9075-14c39b06297d-cert\") pod \"frr-k8s-webhook-server-5478bdb765-p9x6q\" (UID: \"fe4bf7c5-f4cf-4c2c-9075-14c39b06297d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005369 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-metrics-certs\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005390 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-frr-sockets\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005409 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/41460288-0fe6-4f0f-ba8d-121ee673bf0d-frr-startup\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005425 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41460288-0fe6-4f0f-ba8d-121ee673bf0d-metrics-certs\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005450 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdgzt\" (UniqueName: \"kubernetes.io/projected/fe4bf7c5-f4cf-4c2c-9075-14c39b06297d-kube-api-access-wdgzt\") pod \"frr-k8s-webhook-server-5478bdb765-p9x6q\" (UID: \"fe4bf7c5-f4cf-4c2c-9075-14c39b06297d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005471 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-metrics\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005486 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g6fk\" (UniqueName: \"kubernetes.io/projected/3b15f008-2077-4246-af46-d39384412fa5-kube-api-access-2g6fk\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005507 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7732c258-d416-45bd-92a4-1a852c9bf4e6-metrics-certs\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005526 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhcrg\" (UniqueName: \"kubernetes.io/projected/41460288-0fe6-4f0f-ba8d-121ee673bf0d-kube-api-access-jhcrg\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005555 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-memberlist\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.005573 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-reloader\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.006432 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-frr-conf\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.006920 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-frr-sockets\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " 
pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.006963 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-metrics\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.008305 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/41460288-0fe6-4f0f-ba8d-121ee673bf0d-frr-startup\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.008503 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/41460288-0fe6-4f0f-ba8d-121ee673bf0d-reloader\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.014820 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41460288-0fe6-4f0f-ba8d-121ee673bf0d-metrics-certs\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.030718 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhcrg\" (UniqueName: \"kubernetes.io/projected/41460288-0fe6-4f0f-ba8d-121ee673bf0d-kube-api-access-jhcrg\") pod \"frr-k8s-54pq5\" (UID: \"41460288-0fe6-4f0f-ba8d-121ee673bf0d\") " pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.106970 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/fe4bf7c5-f4cf-4c2c-9075-14c39b06297d-cert\") pod \"frr-k8s-webhook-server-5478bdb765-p9x6q\" (UID: \"fe4bf7c5-f4cf-4c2c-9075-14c39b06297d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.107048 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-metrics-certs\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.107085 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdgzt\" (UniqueName: \"kubernetes.io/projected/fe4bf7c5-f4cf-4c2c-9075-14c39b06297d-kube-api-access-wdgzt\") pod \"frr-k8s-webhook-server-5478bdb765-p9x6q\" (UID: \"fe4bf7c5-f4cf-4c2c-9075-14c39b06297d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.107109 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6fk\" (UniqueName: \"kubernetes.io/projected/3b15f008-2077-4246-af46-d39384412fa5-kube-api-access-2g6fk\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.107130 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7732c258-d416-45bd-92a4-1a852c9bf4e6-metrics-certs\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.107164 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-memberlist\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.107190 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7732c258-d416-45bd-92a4-1a852c9bf4e6-cert\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.107207 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p444v\" (UniqueName: \"kubernetes.io/projected/7732c258-d416-45bd-92a4-1a852c9bf4e6-kube-api-access-p444v\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.107222 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b15f008-2077-4246-af46-d39384412fa5-metallb-excludel2\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.108019 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b15f008-2077-4246-af46-d39384412fa5-metallb-excludel2\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: E0929 09:57:00.108145 4922 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Sep 29 09:57:00 crc kubenswrapper[4922]: E0929 09:57:00.108261 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7732c258-d416-45bd-92a4-1a852c9bf4e6-metrics-certs podName:7732c258-d416-45bd-92a4-1a852c9bf4e6 nodeName:}" failed. No retries permitted until 2025-09-29 09:57:00.608231427 +0000 UTC m=+745.974461761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7732c258-d416-45bd-92a4-1a852c9bf4e6-metrics-certs") pod "controller-5d688f5ffc-vz5n2" (UID: "7732c258-d416-45bd-92a4-1a852c9bf4e6") : secret "controller-certs-secret" not found Sep 29 09:57:00 crc kubenswrapper[4922]: E0929 09:57:00.108948 4922 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 29 09:57:00 crc kubenswrapper[4922]: E0929 09:57:00.109009 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-memberlist podName:3b15f008-2077-4246-af46-d39384412fa5 nodeName:}" failed. No retries permitted until 2025-09-29 09:57:00.608989737 +0000 UTC m=+745.975220001 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-memberlist") pod "speaker-jp26n" (UID: "3b15f008-2077-4246-af46-d39384412fa5") : secret "metallb-memberlist" not found Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.113296 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-metrics-certs\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.113455 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.114178 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe4bf7c5-f4cf-4c2c-9075-14c39b06297d-cert\") pod \"frr-k8s-webhook-server-5478bdb765-p9x6q\" (UID: \"fe4bf7c5-f4cf-4c2c-9075-14c39b06297d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.115733 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.124906 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7732c258-d416-45bd-92a4-1a852c9bf4e6-cert\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.137556 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g6fk\" (UniqueName: \"kubernetes.io/projected/3b15f008-2077-4246-af46-d39384412fa5-kube-api-access-2g6fk\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.144682 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdgzt\" (UniqueName: \"kubernetes.io/projected/fe4bf7c5-f4cf-4c2c-9075-14c39b06297d-kube-api-access-wdgzt\") pod \"frr-k8s-webhook-server-5478bdb765-p9x6q\" (UID: \"fe4bf7c5-f4cf-4c2c-9075-14c39b06297d\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.145003 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p444v\" (UniqueName: \"kubernetes.io/projected/7732c258-d416-45bd-92a4-1a852c9bf4e6-kube-api-access-p444v\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.164540 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.411023 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q"] Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.614715 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7732c258-d416-45bd-92a4-1a852c9bf4e6-metrics-certs\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.616016 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-memberlist\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:00 crc kubenswrapper[4922]: E0929 09:57:00.616254 4922 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 29 09:57:00 crc kubenswrapper[4922]: E0929 09:57:00.617971 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-memberlist podName:3b15f008-2077-4246-af46-d39384412fa5 nodeName:}" failed. No retries permitted until 2025-09-29 09:57:01.616783791 +0000 UTC m=+746.983014085 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-memberlist") pod "speaker-jp26n" (UID: "3b15f008-2077-4246-af46-d39384412fa5") : secret "metallb-memberlist" not found Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.627434 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7732c258-d416-45bd-92a4-1a852c9bf4e6-metrics-certs\") pod \"controller-5d688f5ffc-vz5n2\" (UID: \"7732c258-d416-45bd-92a4-1a852c9bf4e6\") " pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.730031 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" event={"ID":"fe4bf7c5-f4cf-4c2c-9075-14c39b06297d","Type":"ContainerStarted","Data":"dc645c47827574213f18e1e9170c3a2e80cbeabcf951cfc130e5e9d019b779f4"} Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.731057 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerStarted","Data":"d222af7d3e3f160aef457b50426270687c82f11b5604f0dfaafcce4b7e069be2"} Sep 29 09:57:00 crc kubenswrapper[4922]: I0929 09:57:00.822926 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.250574 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-vz5n2"] Sep 29 09:57:01 crc kubenswrapper[4922]: W0929 09:57:01.279139 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7732c258_d416_45bd_92a4_1a852c9bf4e6.slice/crio-67e8f03a3916473250ce36811e5dad72436f23690285b5ca79cef9b7bfac221b WatchSource:0}: Error finding container 67e8f03a3916473250ce36811e5dad72436f23690285b5ca79cef9b7bfac221b: Status 404 returned error can't find the container with id 67e8f03a3916473250ce36811e5dad72436f23690285b5ca79cef9b7bfac221b Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.637402 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-memberlist\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.647613 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b15f008-2077-4246-af46-d39384412fa5-memberlist\") pod \"speaker-jp26n\" (UID: \"3b15f008-2077-4246-af46-d39384412fa5\") " pod="metallb-system/speaker-jp26n" Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.697927 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jp26n" Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.745119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-vz5n2" event={"ID":"7732c258-d416-45bd-92a4-1a852c9bf4e6","Type":"ContainerStarted","Data":"3c4ef83e54b95a3bba41c2a931097ac1f461bb8ac05c51d2182d76a225f12059"} Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.745187 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-vz5n2" event={"ID":"7732c258-d416-45bd-92a4-1a852c9bf4e6","Type":"ContainerStarted","Data":"a1762f7b93dea7d5778a2eddf163ca19d8035c8448092e70f3faa9a2380b546a"} Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.745199 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-vz5n2" event={"ID":"7732c258-d416-45bd-92a4-1a852c9bf4e6","Type":"ContainerStarted","Data":"67e8f03a3916473250ce36811e5dad72436f23690285b5ca79cef9b7bfac221b"} Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.745286 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.746543 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jp26n" event={"ID":"3b15f008-2077-4246-af46-d39384412fa5","Type":"ContainerStarted","Data":"940a656ca88ad208db6ebfdaf7c667df53728edec71c85e99674288306a6b436"} Sep 29 09:57:01 crc kubenswrapper[4922]: I0929 09:57:01.768100 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-vz5n2" podStartSLOduration=2.768071902 podStartE2EDuration="2.768071902s" podCreationTimestamp="2025-09-29 09:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:57:01.763303807 +0000 UTC m=+747.129534111" 
watchObservedRunningTime="2025-09-29 09:57:01.768071902 +0000 UTC m=+747.134302166" Sep 29 09:57:02 crc kubenswrapper[4922]: I0929 09:57:02.756125 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jp26n" event={"ID":"3b15f008-2077-4246-af46-d39384412fa5","Type":"ContainerStarted","Data":"d1f95b17bb11ed678b75b8f82a34ec24111081e8cc2a05fabce3fe9843ad59fe"} Sep 29 09:57:02 crc kubenswrapper[4922]: I0929 09:57:02.756551 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jp26n" event={"ID":"3b15f008-2077-4246-af46-d39384412fa5","Type":"ContainerStarted","Data":"d6174e00cea6206d52d56c1815bb1d248762d798a98c08cf883191cdd31b8a80"} Sep 29 09:57:02 crc kubenswrapper[4922]: I0929 09:57:02.756576 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jp26n" Sep 29 09:57:02 crc kubenswrapper[4922]: I0929 09:57:02.782736 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jp26n" podStartSLOduration=3.782714075 podStartE2EDuration="3.782714075s" podCreationTimestamp="2025-09-29 09:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:57:02.777442515 +0000 UTC m=+748.143672779" watchObservedRunningTime="2025-09-29 09:57:02.782714075 +0000 UTC m=+748.148944339" Sep 29 09:57:08 crc kubenswrapper[4922]: I0929 09:57:08.804893 4922 generic.go:334] "Generic (PLEG): container finished" podID="41460288-0fe6-4f0f-ba8d-121ee673bf0d" containerID="5ed2b2af23be6da2906329e4af581fac8a3bc1f20bb1e25d16cbfb855878a7af" exitCode=0 Sep 29 09:57:08 crc kubenswrapper[4922]: I0929 09:57:08.805102 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerDied","Data":"5ed2b2af23be6da2906329e4af581fac8a3bc1f20bb1e25d16cbfb855878a7af"} Sep 29 09:57:08 crc 
kubenswrapper[4922]: I0929 09:57:08.807797 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" event={"ID":"fe4bf7c5-f4cf-4c2c-9075-14c39b06297d","Type":"ContainerStarted","Data":"477a311178977fd30859f10f0f070fbbc09dbe407f5b169f7f25a39e93d51c1a"} Sep 29 09:57:08 crc kubenswrapper[4922]: I0929 09:57:08.808773 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:57:09 crc kubenswrapper[4922]: I0929 09:57:09.819437 4922 generic.go:334] "Generic (PLEG): container finished" podID="41460288-0fe6-4f0f-ba8d-121ee673bf0d" containerID="c92de3b1e9c42ead299ba9e69a04519ac2687512a05219d3e0b22edaf3ad38bc" exitCode=0 Sep 29 09:57:09 crc kubenswrapper[4922]: I0929 09:57:09.819533 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerDied","Data":"c92de3b1e9c42ead299ba9e69a04519ac2687512a05219d3e0b22edaf3ad38bc"} Sep 29 09:57:09 crc kubenswrapper[4922]: I0929 09:57:09.856480 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" podStartSLOduration=3.21109601 podStartE2EDuration="10.856437794s" podCreationTimestamp="2025-09-29 09:56:59 +0000 UTC" firstStartedPulling="2025-09-29 09:57:00.430199488 +0000 UTC m=+745.796429752" lastFinishedPulling="2025-09-29 09:57:08.075541282 +0000 UTC m=+753.441771536" observedRunningTime="2025-09-29 09:57:08.865324706 +0000 UTC m=+754.231555020" watchObservedRunningTime="2025-09-29 09:57:09.856437794 +0000 UTC m=+755.222668098" Sep 29 09:57:10 crc kubenswrapper[4922]: I0929 09:57:10.835588 4922 generic.go:334] "Generic (PLEG): container finished" podID="41460288-0fe6-4f0f-ba8d-121ee673bf0d" containerID="8cfeaaba72015bb67e4b65f7316d9978f4009cc3ea167a86f6078c031226996a" exitCode=0 Sep 29 09:57:10 crc kubenswrapper[4922]: 
I0929 09:57:10.835738 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerDied","Data":"8cfeaaba72015bb67e4b65f7316d9978f4009cc3ea167a86f6078c031226996a"} Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.412558 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4pzjz"] Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.413313 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" podUID="f636765f-e16c-4597-88d7-327472ef1940" containerName="controller-manager" containerID="cri-o://58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c" gracePeriod=30 Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.494139 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c"] Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.494422 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" podUID="a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" containerName="route-controller-manager" containerID="cri-o://7047a1f497040128e27786096f8580c1c34c6bec9073fc72418389ed678a0080" gracePeriod=30 Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.705996 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jp26n" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.857844 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.859527 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerStarted","Data":"817e12286b85322047ab551ccfad9d525908a3a50a5281ae969e6c83cfe4bffc"} Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.859588 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerStarted","Data":"7526b6e8e28da7990b1818eaf0999f91efd85b641f4c4bca6c67bb7ece3f2c98"} Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.859601 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerStarted","Data":"793c1a1a51040f870a883d48a60aa51b209217e520937e6cac84b91332c0f1c1"} Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.859611 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerStarted","Data":"0a61418a638683c39d4be7512e7c062265fe57adebc69fb2d49f6bb7ea099ea9"} Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.868866 4922 generic.go:334] "Generic (PLEG): container finished" podID="f636765f-e16c-4597-88d7-327472ef1940" containerID="58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c" exitCode=0 Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.868998 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" event={"ID":"f636765f-e16c-4597-88d7-327472ef1940","Type":"ContainerDied","Data":"58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c"} Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.869037 4922 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" event={"ID":"f636765f-e16c-4597-88d7-327472ef1940","Type":"ContainerDied","Data":"96477c43d183599280dfad23d3fc5408f7a5d8551fe76027339997fac9b4be81"} Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.869061 4922 scope.go:117] "RemoveContainer" containerID="58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.869079 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4pzjz" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.871022 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f636765f-e16c-4597-88d7-327472ef1940-serving-cert\") pod \"f636765f-e16c-4597-88d7-327472ef1940\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.871109 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-config\") pod \"f636765f-e16c-4597-88d7-327472ef1940\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.871142 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-client-ca\") pod \"f636765f-e16c-4597-88d7-327472ef1940\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.871179 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbbvq\" (UniqueName: \"kubernetes.io/projected/f636765f-e16c-4597-88d7-327472ef1940-kube-api-access-fbbvq\") pod \"f636765f-e16c-4597-88d7-327472ef1940\" (UID: 
\"f636765f-e16c-4597-88d7-327472ef1940\") " Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.871210 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-proxy-ca-bundles\") pod \"f636765f-e16c-4597-88d7-327472ef1940\" (UID: \"f636765f-e16c-4597-88d7-327472ef1940\") " Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.872397 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f636765f-e16c-4597-88d7-327472ef1940" (UID: "f636765f-e16c-4597-88d7-327472ef1940"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.872925 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-config" (OuterVolumeSpecName: "config") pod "f636765f-e16c-4597-88d7-327472ef1940" (UID: "f636765f-e16c-4597-88d7-327472ef1940"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.873363 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-client-ca" (OuterVolumeSpecName: "client-ca") pod "f636765f-e16c-4597-88d7-327472ef1940" (UID: "f636765f-e16c-4597-88d7-327472ef1940"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.882600 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f636765f-e16c-4597-88d7-327472ef1940-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f636765f-e16c-4597-88d7-327472ef1940" (UID: "f636765f-e16c-4597-88d7-327472ef1940"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.883497 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f636765f-e16c-4597-88d7-327472ef1940-kube-api-access-fbbvq" (OuterVolumeSpecName: "kube-api-access-fbbvq") pod "f636765f-e16c-4597-88d7-327472ef1940" (UID: "f636765f-e16c-4597-88d7-327472ef1940"). InnerVolumeSpecName "kube-api-access-fbbvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.889351 4922 generic.go:334] "Generic (PLEG): container finished" podID="a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" containerID="7047a1f497040128e27786096f8580c1c34c6bec9073fc72418389ed678a0080" exitCode=0 Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.889413 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" event={"ID":"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd","Type":"ContainerDied","Data":"7047a1f497040128e27786096f8580c1c34c6bec9073fc72418389ed678a0080"} Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.905936 4922 scope.go:117] "RemoveContainer" containerID="58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c" Sep 29 09:57:11 crc kubenswrapper[4922]: E0929 09:57:11.906659 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c\": 
container with ID starting with 58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c not found: ID does not exist" containerID="58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.906738 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c"} err="failed to get container status \"58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c\": rpc error: code = NotFound desc = could not find container \"58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c\": container with ID starting with 58fe229013c7d5f89951950ff0748be3a497856fb241d5a4e302c78bab7e919c not found: ID does not exist" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.972210 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.972237 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.972251 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbbvq\" (UniqueName: \"kubernetes.io/projected/f636765f-e16c-4597-88d7-327472ef1940-kube-api-access-fbbvq\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.972262 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f636765f-e16c-4597-88d7-327472ef1940-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:11 crc kubenswrapper[4922]: I0929 09:57:11.972270 4922 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/f636765f-e16c-4597-88d7-327472ef1940-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.032141 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.073015 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-config\") pod \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.073064 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8vtv\" (UniqueName: \"kubernetes.io/projected/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-kube-api-access-c8vtv\") pod \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.073119 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-serving-cert\") pod \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.073159 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-client-ca\") pod \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\" (UID: \"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd\") " Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.074973 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-client-ca" (OuterVolumeSpecName: 
"client-ca") pod "a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" (UID: "a226dbbb-e39e-4fa1-aaab-1b28cffcfccd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.074988 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-config" (OuterVolumeSpecName: "config") pod "a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" (UID: "a226dbbb-e39e-4fa1-aaab-1b28cffcfccd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.079507 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" (UID: "a226dbbb-e39e-4fa1-aaab-1b28cffcfccd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.092211 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-kube-api-access-c8vtv" (OuterVolumeSpecName: "kube-api-access-c8vtv") pod "a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" (UID: "a226dbbb-e39e-4fa1-aaab-1b28cffcfccd"). InnerVolumeSpecName "kube-api-access-c8vtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.175901 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.175946 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.175959 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.175974 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8vtv\" (UniqueName: \"kubernetes.io/projected/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd-kube-api-access-c8vtv\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.219970 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4pzjz"] Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.225666 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4pzjz"] Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.366147 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7db7894697-cvwl8"] Sep 29 09:57:12 crc kubenswrapper[4922]: E0929 09:57:12.366489 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f636765f-e16c-4597-88d7-327472ef1940" containerName="controller-manager" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.366509 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f636765f-e16c-4597-88d7-327472ef1940" containerName="controller-manager" Sep 29 09:57:12 crc kubenswrapper[4922]: E0929 09:57:12.366529 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" containerName="route-controller-manager" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.366535 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" containerName="route-controller-manager" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.366652 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" containerName="route-controller-manager" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.366671 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f636765f-e16c-4597-88d7-327472ef1940" containerName="controller-manager" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.367178 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.369247 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.369247 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.369817 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.370174 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.370466 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.370936 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.376265 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.382663 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7db7894697-cvwl8"] Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.408072 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql"] Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.409303 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.418019 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql"] Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.480721 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034f2208-4cb4-468c-829d-052972715d56-config\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.480784 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f414a46-f208-44ec-8cf6-d580bb6f121b-serving-cert\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.480810 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gzwm\" (UniqueName: \"kubernetes.io/projected/2f414a46-f208-44ec-8cf6-d580bb6f121b-kube-api-access-9gzwm\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.480974 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034f2208-4cb4-468c-829d-052972715d56-serving-cert\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: 
\"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.480996 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t87b6\" (UniqueName: \"kubernetes.io/projected/034f2208-4cb4-468c-829d-052972715d56-kube-api-access-t87b6\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.481017 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f414a46-f208-44ec-8cf6-d580bb6f121b-client-ca\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.481035 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034f2208-4cb4-468c-829d-052972715d56-client-ca\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.481074 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f414a46-f208-44ec-8cf6-d580bb6f121b-proxy-ca-bundles\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.481094 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f414a46-f208-44ec-8cf6-d580bb6f121b-config\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.582481 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034f2208-4cb4-468c-829d-052972715d56-config\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.582546 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gzwm\" (UniqueName: \"kubernetes.io/projected/2f414a46-f208-44ec-8cf6-d580bb6f121b-kube-api-access-9gzwm\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.582580 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f414a46-f208-44ec-8cf6-d580bb6f121b-serving-cert\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.583790 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034f2208-4cb4-468c-829d-052972715d56-serving-cert\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " 
pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.583816 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t87b6\" (UniqueName: \"kubernetes.io/projected/034f2208-4cb4-468c-829d-052972715d56-kube-api-access-t87b6\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.583858 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f414a46-f208-44ec-8cf6-d580bb6f121b-client-ca\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.583880 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034f2208-4cb4-468c-829d-052972715d56-client-ca\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.583914 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f414a46-f208-44ec-8cf6-d580bb6f121b-proxy-ca-bundles\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.583949 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f414a46-f208-44ec-8cf6-d580bb6f121b-config\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.584342 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034f2208-4cb4-468c-829d-052972715d56-config\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.585013 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/034f2208-4cb4-468c-829d-052972715d56-client-ca\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.585383 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f414a46-f208-44ec-8cf6-d580bb6f121b-config\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.585665 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f414a46-f208-44ec-8cf6-d580bb6f121b-proxy-ca-bundles\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.586811 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f414a46-f208-44ec-8cf6-d580bb6f121b-client-ca\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.589708 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f414a46-f208-44ec-8cf6-d580bb6f121b-serving-cert\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.590534 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034f2208-4cb4-468c-829d-052972715d56-serving-cert\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.602613 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gzwm\" (UniqueName: \"kubernetes.io/projected/2f414a46-f208-44ec-8cf6-d580bb6f121b-kube-api-access-9gzwm\") pod \"controller-manager-7db7894697-cvwl8\" (UID: \"2f414a46-f208-44ec-8cf6-d580bb6f121b\") " pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.606547 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t87b6\" (UniqueName: \"kubernetes.io/projected/034f2208-4cb4-468c-829d-052972715d56-kube-api-access-t87b6\") pod \"route-controller-manager-846f5f747-kgxql\" (UID: \"034f2208-4cb4-468c-829d-052972715d56\") " 
pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.685358 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.726536 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.911868 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerStarted","Data":"a63058ad20e089331032b16d5eff4e839ca13712bd3dca3290869136637f89e3"} Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.912558 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54pq5" event={"ID":"41460288-0fe6-4f0f-ba8d-121ee673bf0d","Type":"ContainerStarted","Data":"5df98708f4c8dec468136f2b2b8eca14906d944f5cbf4fc660ec0b28c626dd99"} Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.913017 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.932364 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" event={"ID":"a226dbbb-e39e-4fa1-aaab-1b28cffcfccd","Type":"ContainerDied","Data":"356cd216a1e1397703f89c8bd9746decfbe9d74f5e3161335059ada49c3552d0"} Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.932427 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.932439 4922 scope.go:117] "RemoveContainer" containerID="7047a1f497040128e27786096f8580c1c34c6bec9073fc72418389ed678a0080" Sep 29 09:57:12 crc kubenswrapper[4922]: I0929 09:57:12.968672 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-54pq5" podStartSLOduration=6.230401663 podStartE2EDuration="13.968645649s" podCreationTimestamp="2025-09-29 09:56:59 +0000 UTC" firstStartedPulling="2025-09-29 09:57:00.306190532 +0000 UTC m=+745.672420796" lastFinishedPulling="2025-09-29 09:57:08.044434518 +0000 UTC m=+753.410664782" observedRunningTime="2025-09-29 09:57:12.958740017 +0000 UTC m=+758.324970291" watchObservedRunningTime="2025-09-29 09:57:12.968645649 +0000 UTC m=+758.334875903" Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.005860 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c"] Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.009886 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9kd4c"] Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.144541 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql"] Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.227455 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7db7894697-cvwl8"] Sep 29 09:57:13 crc kubenswrapper[4922]: W0929 09:57:13.242717 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f414a46_f208_44ec_8cf6_d580bb6f121b.slice/crio-70212a6a0a1f6d3e7dce9bdc22029ba8c077c979936feea5ea6451e28c71d385 WatchSource:0}: Error finding container 70212a6a0a1f6d3e7dce9bdc22029ba8c077c979936feea5ea6451e28c71d385: Status 404 returned error can't find the container with id 70212a6a0a1f6d3e7dce9bdc22029ba8c077c979936feea5ea6451e28c71d385 Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.463447 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a226dbbb-e39e-4fa1-aaab-1b28cffcfccd" path="/var/lib/kubelet/pods/a226dbbb-e39e-4fa1-aaab-1b28cffcfccd/volumes" Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.465394 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f636765f-e16c-4597-88d7-327472ef1940" path="/var/lib/kubelet/pods/f636765f-e16c-4597-88d7-327472ef1940/volumes" Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.941976 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" event={"ID":"034f2208-4cb4-468c-829d-052972715d56","Type":"ContainerStarted","Data":"6abecb798459a461c99a5b7e519e3b098b4636fbf3605475f7f736afa143bcf3"} Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.942055 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" event={"ID":"034f2208-4cb4-468c-829d-052972715d56","Type":"ContainerStarted","Data":"1164960e4cd1dd87ccc62956908d66c9df799b8ed11b88eeb9d7b9bd728389b4"} Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.943717 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" event={"ID":"2f414a46-f208-44ec-8cf6-d580bb6f121b","Type":"ContainerStarted","Data":"8e9712ad68e11e543629008e0014aac1b308d5180bfdb542e48a5fbe5ca95ea4"} Sep 29 09:57:13 crc kubenswrapper[4922]: 
I0929 09:57:13.944104 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.944131 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" event={"ID":"2f414a46-f208-44ec-8cf6-d580bb6f121b","Type":"ContainerStarted","Data":"70212a6a0a1f6d3e7dce9bdc22029ba8c077c979936feea5ea6451e28c71d385"} Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.965899 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" podStartSLOduration=1.9658750390000002 podStartE2EDuration="1.965875039s" podCreationTimestamp="2025-09-29 09:57:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:57:13.963130996 +0000 UTC m=+759.329361260" watchObservedRunningTime="2025-09-29 09:57:13.965875039 +0000 UTC m=+759.332105303" Sep 29 09:57:13 crc kubenswrapper[4922]: I0929 09:57:13.990768 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" podStartSLOduration=1.9907464780000002 podStartE2EDuration="1.990746478s" podCreationTimestamp="2025-09-29 09:57:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:57:13.986570988 +0000 UTC m=+759.352801252" watchObservedRunningTime="2025-09-29 09:57:13.990746478 +0000 UTC m=+759.356976742" Sep 29 09:57:14 crc kubenswrapper[4922]: I0929 09:57:14.377912 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-846f5f747-kgxql" Sep 29 09:57:14 crc kubenswrapper[4922]: I0929 
09:57:14.890782 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jlxq7"] Sep 29 09:57:14 crc kubenswrapper[4922]: I0929 09:57:14.892333 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jlxq7" Sep 29 09:57:14 crc kubenswrapper[4922]: I0929 09:57:14.895449 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 29 09:57:14 crc kubenswrapper[4922]: I0929 09:57:14.895474 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 29 09:57:14 crc kubenswrapper[4922]: I0929 09:57:14.907340 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jlxq7"] Sep 29 09:57:14 crc kubenswrapper[4922]: I0929 09:57:14.924733 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5ltw\" (UniqueName: \"kubernetes.io/projected/1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f-kube-api-access-l5ltw\") pod \"openstack-operator-index-jlxq7\" (UID: \"1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f\") " pod="openstack-operators/openstack-operator-index-jlxq7" Sep 29 09:57:14 crc kubenswrapper[4922]: I0929 09:57:14.951624 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:14 crc kubenswrapper[4922]: I0929 09:57:14.967847 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7db7894697-cvwl8" Sep 29 09:57:15 crc kubenswrapper[4922]: I0929 09:57:15.027085 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5ltw\" (UniqueName: \"kubernetes.io/projected/1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f-kube-api-access-l5ltw\") pod 
\"openstack-operator-index-jlxq7\" (UID: \"1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f\") " pod="openstack-operators/openstack-operator-index-jlxq7" Sep 29 09:57:15 crc kubenswrapper[4922]: I0929 09:57:15.048586 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5ltw\" (UniqueName: \"kubernetes.io/projected/1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f-kube-api-access-l5ltw\") pod \"openstack-operator-index-jlxq7\" (UID: \"1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f\") " pod="openstack-operators/openstack-operator-index-jlxq7" Sep 29 09:57:15 crc kubenswrapper[4922]: I0929 09:57:15.117487 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:15 crc kubenswrapper[4922]: I0929 09:57:15.166652 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:15 crc kubenswrapper[4922]: I0929 09:57:15.208764 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jlxq7" Sep 29 09:57:15 crc kubenswrapper[4922]: I0929 09:57:15.733317 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jlxq7"] Sep 29 09:57:15 crc kubenswrapper[4922]: W0929 09:57:15.745855 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a13f35a_d0bf_4c08_bfa4_12ae61f4fe5f.slice/crio-ad7eec69684a8964b75d205bb657489f389fd4ef27fa6355ea34cda56358a5d9 WatchSource:0}: Error finding container ad7eec69684a8964b75d205bb657489f389fd4ef27fa6355ea34cda56358a5d9: Status 404 returned error can't find the container with id ad7eec69684a8964b75d205bb657489f389fd4ef27fa6355ea34cda56358a5d9 Sep 29 09:57:15 crc kubenswrapper[4922]: I0929 09:57:15.959708 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jlxq7" event={"ID":"1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f","Type":"ContainerStarted","Data":"ad7eec69684a8964b75d205bb657489f389fd4ef27fa6355ea34cda56358a5d9"} Sep 29 09:57:16 crc kubenswrapper[4922]: I0929 09:57:16.660344 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jlxq7"] Sep 29 09:57:17 crc kubenswrapper[4922]: I0929 09:57:17.069266 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-8r27w"] Sep 29 09:57:17 crc kubenswrapper[4922]: I0929 09:57:17.070510 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8r27w" Sep 29 09:57:17 crc kubenswrapper[4922]: I0929 09:57:17.075592 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-4qcjs" Sep 29 09:57:17 crc kubenswrapper[4922]: I0929 09:57:17.088498 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8r27w"] Sep 29 09:57:17 crc kubenswrapper[4922]: I0929 09:57:17.172067 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgcvf\" (UniqueName: \"kubernetes.io/projected/822c4c9d-3c4c-43db-a891-19b9db1d279b-kube-api-access-tgcvf\") pod \"openstack-operator-index-8r27w\" (UID: \"822c4c9d-3c4c-43db-a891-19b9db1d279b\") " pod="openstack-operators/openstack-operator-index-8r27w" Sep 29 09:57:17 crc kubenswrapper[4922]: I0929 09:57:17.273654 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgcvf\" (UniqueName: \"kubernetes.io/projected/822c4c9d-3c4c-43db-a891-19b9db1d279b-kube-api-access-tgcvf\") pod \"openstack-operator-index-8r27w\" (UID: \"822c4c9d-3c4c-43db-a891-19b9db1d279b\") " pod="openstack-operators/openstack-operator-index-8r27w" Sep 29 09:57:17 crc kubenswrapper[4922]: I0929 09:57:17.308308 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgcvf\" (UniqueName: \"kubernetes.io/projected/822c4c9d-3c4c-43db-a891-19b9db1d279b-kube-api-access-tgcvf\") pod \"openstack-operator-index-8r27w\" (UID: \"822c4c9d-3c4c-43db-a891-19b9db1d279b\") " pod="openstack-operators/openstack-operator-index-8r27w" Sep 29 09:57:17 crc kubenswrapper[4922]: I0929 09:57:17.419679 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8r27w" Sep 29 09:57:17 crc kubenswrapper[4922]: I0929 09:57:17.760982 4922 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 29 09:57:19 crc kubenswrapper[4922]: I0929 09:57:19.386163 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8r27w"] Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.000569 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8r27w" event={"ID":"822c4c9d-3c4c-43db-a891-19b9db1d279b","Type":"ContainerStarted","Data":"6ac207b5db19f3b751868d6674f11fc14993b7f13a4ed39de2fd76ec03e54392"} Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.003015 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8r27w" event={"ID":"822c4c9d-3c4c-43db-a891-19b9db1d279b","Type":"ContainerStarted","Data":"6f0e200a1ced8a40bdc2bcb396a98e2b19f2525e9d0d558587452c8ceec9304d"} Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.003399 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jlxq7" event={"ID":"1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f","Type":"ContainerStarted","Data":"63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f"} Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.003687 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jlxq7" podUID="1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f" containerName="registry-server" containerID="cri-o://63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f" gracePeriod=2 Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.033761 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-8r27w" 
podStartSLOduration=2.966441448 podStartE2EDuration="3.033739051s" podCreationTimestamp="2025-09-29 09:57:17 +0000 UTC" firstStartedPulling="2025-09-29 09:57:19.402318732 +0000 UTC m=+764.768548996" lastFinishedPulling="2025-09-29 09:57:19.469616325 +0000 UTC m=+764.835846599" observedRunningTime="2025-09-29 09:57:20.026304553 +0000 UTC m=+765.392534857" watchObservedRunningTime="2025-09-29 09:57:20.033739051 +0000 UTC m=+765.399969325" Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.052087 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jlxq7" podStartSLOduration=2.876087232 podStartE2EDuration="6.052064236s" podCreationTimestamp="2025-09-29 09:57:14 +0000 UTC" firstStartedPulling="2025-09-29 09:57:15.749640508 +0000 UTC m=+761.115870772" lastFinishedPulling="2025-09-29 09:57:18.925617512 +0000 UTC m=+764.291847776" observedRunningTime="2025-09-29 09:57:20.051807359 +0000 UTC m=+765.418037633" watchObservedRunningTime="2025-09-29 09:57:20.052064236 +0000 UTC m=+765.418294520" Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.171134 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p9x6q" Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.558413 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jlxq7" Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.734171 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5ltw\" (UniqueName: \"kubernetes.io/projected/1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f-kube-api-access-l5ltw\") pod \"1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f\" (UID: \"1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f\") " Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.742573 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f-kube-api-access-l5ltw" (OuterVolumeSpecName: "kube-api-access-l5ltw") pod "1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f" (UID: "1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f"). InnerVolumeSpecName "kube-api-access-l5ltw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.830754 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-vz5n2" Sep 29 09:57:20 crc kubenswrapper[4922]: I0929 09:57:20.835985 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5ltw\" (UniqueName: \"kubernetes.io/projected/1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f-kube-api-access-l5ltw\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:21 crc kubenswrapper[4922]: I0929 09:57:21.015915 4922 generic.go:334] "Generic (PLEG): container finished" podID="1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f" containerID="63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f" exitCode=0 Sep 29 09:57:21 crc kubenswrapper[4922]: I0929 09:57:21.015999 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jlxq7" event={"ID":"1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f","Type":"ContainerDied","Data":"63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f"} Sep 29 09:57:21 crc 
kubenswrapper[4922]: I0929 09:57:21.016036 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jlxq7" Sep 29 09:57:21 crc kubenswrapper[4922]: I0929 09:57:21.016091 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jlxq7" event={"ID":"1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f","Type":"ContainerDied","Data":"ad7eec69684a8964b75d205bb657489f389fd4ef27fa6355ea34cda56358a5d9"} Sep 29 09:57:21 crc kubenswrapper[4922]: I0929 09:57:21.016139 4922 scope.go:117] "RemoveContainer" containerID="63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f" Sep 29 09:57:21 crc kubenswrapper[4922]: I0929 09:57:21.042076 4922 scope.go:117] "RemoveContainer" containerID="63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f" Sep 29 09:57:21 crc kubenswrapper[4922]: E0929 09:57:21.043081 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f\": container with ID starting with 63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f not found: ID does not exist" containerID="63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f" Sep 29 09:57:21 crc kubenswrapper[4922]: I0929 09:57:21.043141 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f"} err="failed to get container status \"63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f\": rpc error: code = NotFound desc = could not find container \"63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f\": container with ID starting with 63a5b52117bb08049a4068ad0dacd35dd83781f1b624c5db7c89db6ab839be6f not found: ID does not exist" Sep 29 09:57:21 crc kubenswrapper[4922]: I0929 09:57:21.058923 4922 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jlxq7"] Sep 29 09:57:21 crc kubenswrapper[4922]: I0929 09:57:21.063018 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jlxq7"] Sep 29 09:57:21 crc kubenswrapper[4922]: I0929 09:57:21.466428 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f" path="/var/lib/kubelet/pods/1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f/volumes" Sep 29 09:57:27 crc kubenswrapper[4922]: I0929 09:57:27.420344 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-8r27w" Sep 29 09:57:27 crc kubenswrapper[4922]: I0929 09:57:27.420877 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-8r27w" Sep 29 09:57:27 crc kubenswrapper[4922]: I0929 09:57:27.462381 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-8r27w" Sep 29 09:57:28 crc kubenswrapper[4922]: I0929 09:57:28.120774 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-8r27w" Sep 29 09:57:30 crc kubenswrapper[4922]: I0929 09:57:30.120318 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-54pq5" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.728332 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl"] Sep 29 09:57:33 crc kubenswrapper[4922]: E0929 09:57:33.729278 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f" containerName="registry-server" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.729304 4922 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f" containerName="registry-server" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.729499 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a13f35a-d0bf-4c08-bfa4-12ae61f4fe5f" containerName="registry-server" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.731019 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.734523 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8jjrw" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.746982 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl"] Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.875282 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-bundle\") pod \"eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.876001 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-util\") pod \"eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.876250 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-l99ff\" (UniqueName: \"kubernetes.io/projected/46ce3f41-6af5-42e1-9712-ad73b0089ad9-kube-api-access-l99ff\") pod \"eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.978489 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-bundle\") pod \"eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.979004 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-util\") pod \"eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.979212 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l99ff\" (UniqueName: \"kubernetes.io/projected/46ce3f41-6af5-42e1-9712-ad73b0089ad9-kube-api-access-l99ff\") pod \"eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.980302 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-bundle\") pod 
\"eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:33 crc kubenswrapper[4922]: I0929 09:57:33.980368 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-util\") pod \"eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:34 crc kubenswrapper[4922]: I0929 09:57:34.007634 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99ff\" (UniqueName: \"kubernetes.io/projected/46ce3f41-6af5-42e1-9712-ad73b0089ad9-kube-api-access-l99ff\") pod \"eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:34 crc kubenswrapper[4922]: I0929 09:57:34.074428 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:34 crc kubenswrapper[4922]: I0929 09:57:34.575187 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl"] Sep 29 09:57:34 crc kubenswrapper[4922]: W0929 09:57:34.583452 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ce3f41_6af5_42e1_9712_ad73b0089ad9.slice/crio-1c7ef9d02cdf129eb0845177331a4e5cfc4275abe37bf4f48697fa33e16f54e1 WatchSource:0}: Error finding container 1c7ef9d02cdf129eb0845177331a4e5cfc4275abe37bf4f48697fa33e16f54e1: Status 404 returned error can't find the container with id 1c7ef9d02cdf129eb0845177331a4e5cfc4275abe37bf4f48697fa33e16f54e1 Sep 29 09:57:34 crc kubenswrapper[4922]: E0929 09:57:34.879200 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ce3f41_6af5_42e1_9712_ad73b0089ad9.slice/crio-26c6314610c0ae4d7a6587cce97cf2b31e60accf0f0d6dc685be50bd44f48e23.scope\": RecentStats: unable to find data in memory cache]" Sep 29 09:57:35 crc kubenswrapper[4922]: I0929 09:57:35.138958 4922 generic.go:334] "Generic (PLEG): container finished" podID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerID="26c6314610c0ae4d7a6587cce97cf2b31e60accf0f0d6dc685be50bd44f48e23" exitCode=0 Sep 29 09:57:35 crc kubenswrapper[4922]: I0929 09:57:35.139045 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" event={"ID":"46ce3f41-6af5-42e1-9712-ad73b0089ad9","Type":"ContainerDied","Data":"26c6314610c0ae4d7a6587cce97cf2b31e60accf0f0d6dc685be50bd44f48e23"} Sep 29 09:57:35 crc kubenswrapper[4922]: I0929 09:57:35.139100 4922 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" event={"ID":"46ce3f41-6af5-42e1-9712-ad73b0089ad9","Type":"ContainerStarted","Data":"1c7ef9d02cdf129eb0845177331a4e5cfc4275abe37bf4f48697fa33e16f54e1"} Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.089764 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ht225"] Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.091875 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.106211 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ht225"] Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.245869 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scn4d\" (UniqueName: \"kubernetes.io/projected/3226513b-53c1-41bd-8d71-e866ccf84cdc-kube-api-access-scn4d\") pod \"certified-operators-ht225\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.245931 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-catalog-content\") pod \"certified-operators-ht225\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.245968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-utilities\") pod \"certified-operators-ht225\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " 
pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.347423 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scn4d\" (UniqueName: \"kubernetes.io/projected/3226513b-53c1-41bd-8d71-e866ccf84cdc-kube-api-access-scn4d\") pod \"certified-operators-ht225\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.347471 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-catalog-content\") pod \"certified-operators-ht225\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.347532 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-utilities\") pod \"certified-operators-ht225\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.348045 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-utilities\") pod \"certified-operators-ht225\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.348154 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-catalog-content\") pod \"certified-operators-ht225\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " 
pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.377804 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scn4d\" (UniqueName: \"kubernetes.io/projected/3226513b-53c1-41bd-8d71-e866ccf84cdc-kube-api-access-scn4d\") pod \"certified-operators-ht225\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.452356 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:37 crc kubenswrapper[4922]: I0929 09:57:37.940021 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ht225"] Sep 29 09:57:37 crc kubenswrapper[4922]: W0929 09:57:37.947652 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3226513b_53c1_41bd_8d71_e866ccf84cdc.slice/crio-94cb170bcc303ad07918515fd14b2c0c1e9e754228cc046506e41532e3e56bf5 WatchSource:0}: Error finding container 94cb170bcc303ad07918515fd14b2c0c1e9e754228cc046506e41532e3e56bf5: Status 404 returned error can't find the container with id 94cb170bcc303ad07918515fd14b2c0c1e9e754228cc046506e41532e3e56bf5 Sep 29 09:57:38 crc kubenswrapper[4922]: I0929 09:57:38.170679 4922 generic.go:334] "Generic (PLEG): container finished" podID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerID="7fad45eff62db36cc61297b11f7acf640e303170cd831bba50e224cba8d37efb" exitCode=0 Sep 29 09:57:38 crc kubenswrapper[4922]: I0929 09:57:38.170743 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" event={"ID":"46ce3f41-6af5-42e1-9712-ad73b0089ad9","Type":"ContainerDied","Data":"7fad45eff62db36cc61297b11f7acf640e303170cd831bba50e224cba8d37efb"} Sep 29 09:57:38 crc 
kubenswrapper[4922]: I0929 09:57:38.179403 4922 generic.go:334] "Generic (PLEG): container finished" podID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerID="a823ad598046e8e7d47f7cc2f82e3d591ecdad4d16d73b2e4986a6d1c887d72b" exitCode=0 Sep 29 09:57:38 crc kubenswrapper[4922]: I0929 09:57:38.179455 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht225" event={"ID":"3226513b-53c1-41bd-8d71-e866ccf84cdc","Type":"ContainerDied","Data":"a823ad598046e8e7d47f7cc2f82e3d591ecdad4d16d73b2e4986a6d1c887d72b"} Sep 29 09:57:38 crc kubenswrapper[4922]: I0929 09:57:38.179489 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht225" event={"ID":"3226513b-53c1-41bd-8d71-e866ccf84cdc","Type":"ContainerStarted","Data":"94cb170bcc303ad07918515fd14b2c0c1e9e754228cc046506e41532e3e56bf5"} Sep 29 09:57:39 crc kubenswrapper[4922]: I0929 09:57:39.201710 4922 generic.go:334] "Generic (PLEG): container finished" podID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerID="73215d13290b9c6197189cc7ebd09bea054191e400d793fa670906a77e5d1091" exitCode=0 Sep 29 09:57:39 crc kubenswrapper[4922]: I0929 09:57:39.202367 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" event={"ID":"46ce3f41-6af5-42e1-9712-ad73b0089ad9","Type":"ContainerDied","Data":"73215d13290b9c6197189cc7ebd09bea054191e400d793fa670906a77e5d1091"} Sep 29 09:57:39 crc kubenswrapper[4922]: I0929 09:57:39.219245 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht225" event={"ID":"3226513b-53c1-41bd-8d71-e866ccf84cdc","Type":"ContainerStarted","Data":"a5553cea9d9e94911814f8272cc51e132fa11fd08aa243267a143494feeeee0e"} Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.231625 4922 generic.go:334] "Generic (PLEG): container finished" podID="3226513b-53c1-41bd-8d71-e866ccf84cdc" 
containerID="a5553cea9d9e94911814f8272cc51e132fa11fd08aa243267a143494feeeee0e" exitCode=0 Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.231736 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht225" event={"ID":"3226513b-53c1-41bd-8d71-e866ccf84cdc","Type":"ContainerDied","Data":"a5553cea9d9e94911814f8272cc51e132fa11fd08aa243267a143494feeeee0e"} Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.639855 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.736853 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-bundle\") pod \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.736980 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l99ff\" (UniqueName: \"kubernetes.io/projected/46ce3f41-6af5-42e1-9712-ad73b0089ad9-kube-api-access-l99ff\") pod \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.737107 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-util\") pod \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\" (UID: \"46ce3f41-6af5-42e1-9712-ad73b0089ad9\") " Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.738107 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-bundle" (OuterVolumeSpecName: "bundle") pod "46ce3f41-6af5-42e1-9712-ad73b0089ad9" (UID: 
"46ce3f41-6af5-42e1-9712-ad73b0089ad9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.743757 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ce3f41-6af5-42e1-9712-ad73b0089ad9-kube-api-access-l99ff" (OuterVolumeSpecName: "kube-api-access-l99ff") pod "46ce3f41-6af5-42e1-9712-ad73b0089ad9" (UID: "46ce3f41-6af5-42e1-9712-ad73b0089ad9"). InnerVolumeSpecName "kube-api-access-l99ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.752777 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-util" (OuterVolumeSpecName: "util") pod "46ce3f41-6af5-42e1-9712-ad73b0089ad9" (UID: "46ce3f41-6af5-42e1-9712-ad73b0089ad9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.839541 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.839595 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l99ff\" (UniqueName: \"kubernetes.io/projected/46ce3f41-6af5-42e1-9712-ad73b0089ad9-kube-api-access-l99ff\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:40 crc kubenswrapper[4922]: I0929 09:57:40.839618 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46ce3f41-6af5-42e1-9712-ad73b0089ad9-util\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:41 crc kubenswrapper[4922]: I0929 09:57:41.244147 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht225" 
event={"ID":"3226513b-53c1-41bd-8d71-e866ccf84cdc","Type":"ContainerStarted","Data":"a302014d676d0b00d19f898a5cdde17aafa9e43f5cfe2740496ecc8279afb8b7"} Sep 29 09:57:41 crc kubenswrapper[4922]: I0929 09:57:41.247406 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" event={"ID":"46ce3f41-6af5-42e1-9712-ad73b0089ad9","Type":"ContainerDied","Data":"1c7ef9d02cdf129eb0845177331a4e5cfc4275abe37bf4f48697fa33e16f54e1"} Sep 29 09:57:41 crc kubenswrapper[4922]: I0929 09:57:41.247465 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c7ef9d02cdf129eb0845177331a4e5cfc4275abe37bf4f48697fa33e16f54e1" Sep 29 09:57:41 crc kubenswrapper[4922]: I0929 09:57:41.247460 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl" Sep 29 09:57:41 crc kubenswrapper[4922]: I0929 09:57:41.282199 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ht225" podStartSLOduration=1.642982521 podStartE2EDuration="4.282167284s" podCreationTimestamp="2025-09-29 09:57:37 +0000 UTC" firstStartedPulling="2025-09-29 09:57:38.189323562 +0000 UTC m=+783.555553826" lastFinishedPulling="2025-09-29 09:57:40.828508305 +0000 UTC m=+786.194738589" observedRunningTime="2025-09-29 09:57:41.275787014 +0000 UTC m=+786.642017288" watchObservedRunningTime="2025-09-29 09:57:41.282167284 +0000 UTC m=+786.648397568" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.672904 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tz8sh"] Sep 29 09:57:43 crc kubenswrapper[4922]: E0929 09:57:43.673506 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerName="util" Sep 29 09:57:43 crc kubenswrapper[4922]: 
I0929 09:57:43.673522 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerName="util" Sep 29 09:57:43 crc kubenswrapper[4922]: E0929 09:57:43.673534 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerName="pull" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.673540 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerName="pull" Sep 29 09:57:43 crc kubenswrapper[4922]: E0929 09:57:43.673553 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerName="extract" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.673559 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerName="extract" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.673691 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="46ce3f41-6af5-42e1-9712-ad73b0089ad9" containerName="extract" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.674646 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.716144 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tz8sh"] Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.793152 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp8lm\" (UniqueName: \"kubernetes.io/projected/a05ef8b7-a255-471a-af02-1b69fc0cfa67-kube-api-access-fp8lm\") pod \"redhat-operators-tz8sh\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.793246 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-catalog-content\") pod \"redhat-operators-tz8sh\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.793303 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-utilities\") pod \"redhat-operators-tz8sh\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.895043 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp8lm\" (UniqueName: \"kubernetes.io/projected/a05ef8b7-a255-471a-af02-1b69fc0cfa67-kube-api-access-fp8lm\") pod \"redhat-operators-tz8sh\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.895118 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-catalog-content\") pod \"redhat-operators-tz8sh\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.895172 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-utilities\") pod \"redhat-operators-tz8sh\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.896156 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-catalog-content\") pod \"redhat-operators-tz8sh\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.896158 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-utilities\") pod \"redhat-operators-tz8sh\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:43 crc kubenswrapper[4922]: I0929 09:57:43.919136 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp8lm\" (UniqueName: \"kubernetes.io/projected/a05ef8b7-a255-471a-af02-1b69fc0cfa67-kube-api-access-fp8lm\") pod \"redhat-operators-tz8sh\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:44 crc kubenswrapper[4922]: I0929 09:57:44.012401 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:44 crc kubenswrapper[4922]: I0929 09:57:44.546247 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tz8sh"] Sep 29 09:57:44 crc kubenswrapper[4922]: W0929 09:57:44.563582 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda05ef8b7_a255_471a_af02_1b69fc0cfa67.slice/crio-71551af03e8b6b5406ab3c7b02c7c4467fd8e36915d692262d8ed00afb2542ba WatchSource:0}: Error finding container 71551af03e8b6b5406ab3c7b02c7c4467fd8e36915d692262d8ed00afb2542ba: Status 404 returned error can't find the container with id 71551af03e8b6b5406ab3c7b02c7c4467fd8e36915d692262d8ed00afb2542ba Sep 29 09:57:45 crc kubenswrapper[4922]: E0929 09:57:45.058480 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda05ef8b7_a255_471a_af02_1b69fc0cfa67.slice/crio-conmon-e33562f6433d57d6e72a0f5239334a9e5fde7af481fe855c3934f140bf5c06f9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda05ef8b7_a255_471a_af02_1b69fc0cfa67.slice/crio-e33562f6433d57d6e72a0f5239334a9e5fde7af481fe855c3934f140bf5c06f9.scope\": RecentStats: unable to find data in memory cache]" Sep 29 09:57:45 crc kubenswrapper[4922]: I0929 09:57:45.286735 4922 generic.go:334] "Generic (PLEG): container finished" podID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerID="e33562f6433d57d6e72a0f5239334a9e5fde7af481fe855c3934f140bf5c06f9" exitCode=0 Sep 29 09:57:45 crc kubenswrapper[4922]: I0929 09:57:45.286796 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz8sh" 
event={"ID":"a05ef8b7-a255-471a-af02-1b69fc0cfa67","Type":"ContainerDied","Data":"e33562f6433d57d6e72a0f5239334a9e5fde7af481fe855c3934f140bf5c06f9"} Sep 29 09:57:45 crc kubenswrapper[4922]: I0929 09:57:45.286855 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz8sh" event={"ID":"a05ef8b7-a255-471a-af02-1b69fc0cfa67","Type":"ContainerStarted","Data":"71551af03e8b6b5406ab3c7b02c7c4467fd8e36915d692262d8ed00afb2542ba"} Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.005513 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl"] Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.007157 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.012025 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-vqrps" Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.035112 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89nsp\" (UniqueName: \"kubernetes.io/projected/77c866e5-8ec4-47ac-809c-0fc002c47957-kube-api-access-89nsp\") pod \"openstack-operator-controller-operator-7484b66f-slfdl\" (UID: \"77c866e5-8ec4-47ac-809c-0fc002c47957\") " pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.083661 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl"] Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.136938 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89nsp\" (UniqueName: 
\"kubernetes.io/projected/77c866e5-8ec4-47ac-809c-0fc002c47957-kube-api-access-89nsp\") pod \"openstack-operator-controller-operator-7484b66f-slfdl\" (UID: \"77c866e5-8ec4-47ac-809c-0fc002c47957\") " pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.165162 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89nsp\" (UniqueName: \"kubernetes.io/projected/77c866e5-8ec4-47ac-809c-0fc002c47957-kube-api-access-89nsp\") pod \"openstack-operator-controller-operator-7484b66f-slfdl\" (UID: \"77c866e5-8ec4-47ac-809c-0fc002c47957\") " pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.295627 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz8sh" event={"ID":"a05ef8b7-a255-471a-af02-1b69fc0cfa67","Type":"ContainerStarted","Data":"ed1e2c323fa8a5894acc381ece339f0fdcd79b92446a5ac4196ec3d0b3e4425b"} Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.328273 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" Sep 29 09:57:46 crc kubenswrapper[4922]: I0929 09:57:46.790826 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl"] Sep 29 09:57:47 crc kubenswrapper[4922]: I0929 09:57:47.310751 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" event={"ID":"77c866e5-8ec4-47ac-809c-0fc002c47957","Type":"ContainerStarted","Data":"94ad7e9c66464ae5571d9a04c29f677d716f29921522d5b5e955642493ffe05b"} Sep 29 09:57:47 crc kubenswrapper[4922]: I0929 09:57:47.314427 4922 generic.go:334] "Generic (PLEG): container finished" podID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerID="ed1e2c323fa8a5894acc381ece339f0fdcd79b92446a5ac4196ec3d0b3e4425b" exitCode=0 Sep 29 09:57:47 crc kubenswrapper[4922]: I0929 09:57:47.314492 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz8sh" event={"ID":"a05ef8b7-a255-471a-af02-1b69fc0cfa67","Type":"ContainerDied","Data":"ed1e2c323fa8a5894acc381ece339f0fdcd79b92446a5ac4196ec3d0b3e4425b"} Sep 29 09:57:47 crc kubenswrapper[4922]: I0929 09:57:47.474813 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:47 crc kubenswrapper[4922]: I0929 09:57:47.475739 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:47 crc kubenswrapper[4922]: I0929 09:57:47.519071 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:48 crc kubenswrapper[4922]: I0929 09:57:48.330290 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz8sh" 
event={"ID":"a05ef8b7-a255-471a-af02-1b69fc0cfa67","Type":"ContainerStarted","Data":"c8af49880113d6aaf6939bedb41408406492eb4185456919ee5686a280475d84"} Sep 29 09:57:48 crc kubenswrapper[4922]: I0929 09:57:48.356443 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tz8sh" podStartSLOduration=2.6836495769999997 podStartE2EDuration="5.356422519s" podCreationTimestamp="2025-09-29 09:57:43 +0000 UTC" firstStartedPulling="2025-09-29 09:57:45.289276568 +0000 UTC m=+790.655506832" lastFinishedPulling="2025-09-29 09:57:47.96204951 +0000 UTC m=+793.328279774" observedRunningTime="2025-09-29 09:57:48.349739571 +0000 UTC m=+793.715969875" watchObservedRunningTime="2025-09-29 09:57:48.356422519 +0000 UTC m=+793.722652793" Sep 29 09:57:48 crc kubenswrapper[4922]: I0929 09:57:48.391500 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.060979 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ht225"] Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.061636 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ht225" podUID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerName="registry-server" containerID="cri-o://a302014d676d0b00d19f898a5cdde17aafa9e43f5cfe2740496ecc8279afb8b7" gracePeriod=2 Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.365344 4922 generic.go:334] "Generic (PLEG): container finished" podID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerID="a302014d676d0b00d19f898a5cdde17aafa9e43f5cfe2740496ecc8279afb8b7" exitCode=0 Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.365441 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht225" 
event={"ID":"3226513b-53c1-41bd-8d71-e866ccf84cdc","Type":"ContainerDied","Data":"a302014d676d0b00d19f898a5cdde17aafa9e43f5cfe2740496ecc8279afb8b7"} Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.684196 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.862762 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-utilities\") pod \"3226513b-53c1-41bd-8d71-e866ccf84cdc\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.862886 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scn4d\" (UniqueName: \"kubernetes.io/projected/3226513b-53c1-41bd-8d71-e866ccf84cdc-kube-api-access-scn4d\") pod \"3226513b-53c1-41bd-8d71-e866ccf84cdc\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.863027 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-catalog-content\") pod \"3226513b-53c1-41bd-8d71-e866ccf84cdc\" (UID: \"3226513b-53c1-41bd-8d71-e866ccf84cdc\") " Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.864658 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-utilities" (OuterVolumeSpecName: "utilities") pod "3226513b-53c1-41bd-8d71-e866ccf84cdc" (UID: "3226513b-53c1-41bd-8d71-e866ccf84cdc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.873234 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3226513b-53c1-41bd-8d71-e866ccf84cdc-kube-api-access-scn4d" (OuterVolumeSpecName: "kube-api-access-scn4d") pod "3226513b-53c1-41bd-8d71-e866ccf84cdc" (UID: "3226513b-53c1-41bd-8d71-e866ccf84cdc"). InnerVolumeSpecName "kube-api-access-scn4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.932440 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3226513b-53c1-41bd-8d71-e866ccf84cdc" (UID: "3226513b-53c1-41bd-8d71-e866ccf84cdc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.964941 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.965001 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scn4d\" (UniqueName: \"kubernetes.io/projected/3226513b-53c1-41bd-8d71-e866ccf84cdc-kube-api-access-scn4d\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:51 crc kubenswrapper[4922]: I0929 09:57:51.965023 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3226513b-53c1-41bd-8d71-e866ccf84cdc-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:57:52 crc kubenswrapper[4922]: I0929 09:57:52.377557 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ht225" Sep 29 09:57:52 crc kubenswrapper[4922]: I0929 09:57:52.377580 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ht225" event={"ID":"3226513b-53c1-41bd-8d71-e866ccf84cdc","Type":"ContainerDied","Data":"94cb170bcc303ad07918515fd14b2c0c1e9e754228cc046506e41532e3e56bf5"} Sep 29 09:57:52 crc kubenswrapper[4922]: I0929 09:57:52.377770 4922 scope.go:117] "RemoveContainer" containerID="a302014d676d0b00d19f898a5cdde17aafa9e43f5cfe2740496ecc8279afb8b7" Sep 29 09:57:52 crc kubenswrapper[4922]: I0929 09:57:52.380785 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" event={"ID":"77c866e5-8ec4-47ac-809c-0fc002c47957","Type":"ContainerStarted","Data":"88f57f07374a8914e3d20ea50f81abcaa247e98815bccd9074edcd389d69e220"} Sep 29 09:57:52 crc kubenswrapper[4922]: I0929 09:57:52.400410 4922 scope.go:117] "RemoveContainer" containerID="a5553cea9d9e94911814f8272cc51e132fa11fd08aa243267a143494feeeee0e" Sep 29 09:57:52 crc kubenswrapper[4922]: I0929 09:57:52.416288 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ht225"] Sep 29 09:57:52 crc kubenswrapper[4922]: I0929 09:57:52.422049 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ht225"] Sep 29 09:57:52 crc kubenswrapper[4922]: I0929 09:57:52.444681 4922 scope.go:117] "RemoveContainer" containerID="a823ad598046e8e7d47f7cc2f82e3d591ecdad4d16d73b2e4986a6d1c887d72b" Sep 29 09:57:53 crc kubenswrapper[4922]: I0929 09:57:53.459106 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3226513b-53c1-41bd-8d71-e866ccf84cdc" path="/var/lib/kubelet/pods/3226513b-53c1-41bd-8d71-e866ccf84cdc/volumes" Sep 29 09:57:54 crc kubenswrapper[4922]: I0929 09:57:54.013196 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:54 crc kubenswrapper[4922]: I0929 09:57:54.013270 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:54 crc kubenswrapper[4922]: I0929 09:57:54.102852 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:54 crc kubenswrapper[4922]: I0929 09:57:54.459685 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:57:55 crc kubenswrapper[4922]: I0929 09:57:55.407367 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" event={"ID":"77c866e5-8ec4-47ac-809c-0fc002c47957","Type":"ContainerStarted","Data":"a6e4c0b9de782dfc44d6521e934a67334995d5df892fb84f6bed65a739a81410"} Sep 29 09:57:55 crc kubenswrapper[4922]: I0929 09:57:55.446130 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" podStartSLOduration=2.451991917 podStartE2EDuration="10.446097462s" podCreationTimestamp="2025-09-29 09:57:45 +0000 UTC" firstStartedPulling="2025-09-29 09:57:46.796921711 +0000 UTC m=+792.163151975" lastFinishedPulling="2025-09-29 09:57:54.791027256 +0000 UTC m=+800.157257520" observedRunningTime="2025-09-29 09:57:55.444748856 +0000 UTC m=+800.810979140" watchObservedRunningTime="2025-09-29 09:57:55.446097462 +0000 UTC m=+800.812327746" Sep 29 09:57:56 crc kubenswrapper[4922]: I0929 09:57:56.329016 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" Sep 29 09:57:57 crc kubenswrapper[4922]: I0929 09:57:57.432256 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-operator-7484b66f-slfdl" Sep 29 09:57:57 crc kubenswrapper[4922]: I0929 09:57:57.467302 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tz8sh"] Sep 29 09:57:57 crc kubenswrapper[4922]: I0929 09:57:57.467649 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tz8sh" podUID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerName="registry-server" containerID="cri-o://c8af49880113d6aaf6939bedb41408406492eb4185456919ee5686a280475d84" gracePeriod=2 Sep 29 09:57:59 crc kubenswrapper[4922]: I0929 09:57:59.449219 4922 generic.go:334] "Generic (PLEG): container finished" podID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerID="c8af49880113d6aaf6939bedb41408406492eb4185456919ee5686a280475d84" exitCode=0 Sep 29 09:57:59 crc kubenswrapper[4922]: I0929 09:57:59.449342 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz8sh" event={"ID":"a05ef8b7-a255-471a-af02-1b69fc0cfa67","Type":"ContainerDied","Data":"c8af49880113d6aaf6939bedb41408406492eb4185456919ee5686a280475d84"} Sep 29 09:57:59 crc kubenswrapper[4922]: I0929 09:57:59.978570 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.112016 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-utilities\") pod \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.112149 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp8lm\" (UniqueName: \"kubernetes.io/projected/a05ef8b7-a255-471a-af02-1b69fc0cfa67-kube-api-access-fp8lm\") pod \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.112227 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-catalog-content\") pod \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\" (UID: \"a05ef8b7-a255-471a-af02-1b69fc0cfa67\") " Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.113357 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-utilities" (OuterVolumeSpecName: "utilities") pod "a05ef8b7-a255-471a-af02-1b69fc0cfa67" (UID: "a05ef8b7-a255-471a-af02-1b69fc0cfa67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.133402 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a05ef8b7-a255-471a-af02-1b69fc0cfa67-kube-api-access-fp8lm" (OuterVolumeSpecName: "kube-api-access-fp8lm") pod "a05ef8b7-a255-471a-af02-1b69fc0cfa67" (UID: "a05ef8b7-a255-471a-af02-1b69fc0cfa67"). InnerVolumeSpecName "kube-api-access-fp8lm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.214718 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp8lm\" (UniqueName: \"kubernetes.io/projected/a05ef8b7-a255-471a-af02-1b69fc0cfa67-kube-api-access-fp8lm\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.214772 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.233492 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a05ef8b7-a255-471a-af02-1b69fc0cfa67" (UID: "a05ef8b7-a255-471a-af02-1b69fc0cfa67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.317343 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a05ef8b7-a255-471a-af02-1b69fc0cfa67-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.467799 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz8sh" event={"ID":"a05ef8b7-a255-471a-af02-1b69fc0cfa67","Type":"ContainerDied","Data":"71551af03e8b6b5406ab3c7b02c7c4467fd8e36915d692262d8ed00afb2542ba"} Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.467968 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tz8sh" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.467988 4922 scope.go:117] "RemoveContainer" containerID="c8af49880113d6aaf6939bedb41408406492eb4185456919ee5686a280475d84" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.501204 4922 scope.go:117] "RemoveContainer" containerID="ed1e2c323fa8a5894acc381ece339f0fdcd79b92446a5ac4196ec3d0b3e4425b" Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.527071 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tz8sh"] Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.534051 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tz8sh"] Sep 29 09:58:00 crc kubenswrapper[4922]: I0929 09:58:00.552028 4922 scope.go:117] "RemoveContainer" containerID="e33562f6433d57d6e72a0f5239334a9e5fde7af481fe855c3934f140bf5c06f9" Sep 29 09:58:01 crc kubenswrapper[4922]: I0929 09:58:01.463113 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" path="/var/lib/kubelet/pods/a05ef8b7-a255-471a-af02-1b69fc0cfa67/volumes" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.726528 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mlplc"] Sep 29 09:58:03 crc kubenswrapper[4922]: E0929 09:58:03.727269 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerName="extract-content" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.727286 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerName="extract-content" Sep 29 09:58:03 crc kubenswrapper[4922]: E0929 09:58:03.727297 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerName="extract-content" Sep 29 09:58:03 crc 
kubenswrapper[4922]: I0929 09:58:03.727303 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerName="extract-content" Sep 29 09:58:03 crc kubenswrapper[4922]: E0929 09:58:03.727313 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerName="extract-utilities" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.727321 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerName="extract-utilities" Sep 29 09:58:03 crc kubenswrapper[4922]: E0929 09:58:03.727334 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerName="registry-server" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.727341 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerName="registry-server" Sep 29 09:58:03 crc kubenswrapper[4922]: E0929 09:58:03.727349 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerName="registry-server" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.727354 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerName="registry-server" Sep 29 09:58:03 crc kubenswrapper[4922]: E0929 09:58:03.727363 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerName="extract-utilities" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.727369 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerName="extract-utilities" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.727499 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3226513b-53c1-41bd-8d71-e866ccf84cdc" containerName="registry-server" Sep 29 09:58:03 crc 
kubenswrapper[4922]: I0929 09:58:03.727514 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05ef8b7-a255-471a-af02-1b69fc0cfa67" containerName="registry-server" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.728522 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.758277 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlplc"] Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.774571 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-utilities\") pod \"redhat-marketplace-mlplc\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.774614 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljn22\" (UniqueName: \"kubernetes.io/projected/0b90a978-860b-46f3-a6c8-c7da96bcab3c-kube-api-access-ljn22\") pod \"redhat-marketplace-mlplc\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.774666 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-catalog-content\") pod \"redhat-marketplace-mlplc\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.876235 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-utilities\") pod \"redhat-marketplace-mlplc\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.876295 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljn22\" (UniqueName: \"kubernetes.io/projected/0b90a978-860b-46f3-a6c8-c7da96bcab3c-kube-api-access-ljn22\") pod \"redhat-marketplace-mlplc\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.876362 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-catalog-content\") pod \"redhat-marketplace-mlplc\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.876893 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-utilities\") pod \"redhat-marketplace-mlplc\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.876997 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-catalog-content\") pod \"redhat-marketplace-mlplc\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:03 crc kubenswrapper[4922]: I0929 09:58:03.902485 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljn22\" (UniqueName: 
\"kubernetes.io/projected/0b90a978-860b-46f3-a6c8-c7da96bcab3c-kube-api-access-ljn22\") pod \"redhat-marketplace-mlplc\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:04 crc kubenswrapper[4922]: I0929 09:58:04.045638 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:04 crc kubenswrapper[4922]: I0929 09:58:04.492386 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlplc"] Sep 29 09:58:04 crc kubenswrapper[4922]: I0929 09:58:04.533363 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlplc" event={"ID":"0b90a978-860b-46f3-a6c8-c7da96bcab3c","Type":"ContainerStarted","Data":"afa5edcda38522ede53cccaf7805f312313b60765714c570c937f1f950f5ffee"} Sep 29 09:58:05 crc kubenswrapper[4922]: I0929 09:58:05.543964 4922 generic.go:334] "Generic (PLEG): container finished" podID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerID="6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618" exitCode=0 Sep 29 09:58:05 crc kubenswrapper[4922]: I0929 09:58:05.544045 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlplc" event={"ID":"0b90a978-860b-46f3-a6c8-c7da96bcab3c","Type":"ContainerDied","Data":"6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618"} Sep 29 09:58:06 crc kubenswrapper[4922]: I0929 09:58:06.555165 4922 generic.go:334] "Generic (PLEG): container finished" podID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerID="ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f" exitCode=0 Sep 29 09:58:06 crc kubenswrapper[4922]: I0929 09:58:06.555283 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlplc" 
event={"ID":"0b90a978-860b-46f3-a6c8-c7da96bcab3c","Type":"ContainerDied","Data":"ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f"} Sep 29 09:58:07 crc kubenswrapper[4922]: I0929 09:58:07.564697 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlplc" event={"ID":"0b90a978-860b-46f3-a6c8-c7da96bcab3c","Type":"ContainerStarted","Data":"c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956"} Sep 29 09:58:07 crc kubenswrapper[4922]: I0929 09:58:07.588187 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mlplc" podStartSLOduration=3.173659557 podStartE2EDuration="4.588166512s" podCreationTimestamp="2025-09-29 09:58:03 +0000 UTC" firstStartedPulling="2025-09-29 09:58:05.546094369 +0000 UTC m=+810.912324633" lastFinishedPulling="2025-09-29 09:58:06.960601324 +0000 UTC m=+812.326831588" observedRunningTime="2025-09-29 09:58:07.583571859 +0000 UTC m=+812.949802123" watchObservedRunningTime="2025-09-29 09:58:07.588166512 +0000 UTC m=+812.954396776" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.668791 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5hprb"] Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.670661 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.686790 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5hprb"] Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.814306 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cnhl\" (UniqueName: \"kubernetes.io/projected/99d0ac0f-4be9-485a-8869-21d69d8f86b4-kube-api-access-8cnhl\") pod \"community-operators-5hprb\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.814413 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-utilities\") pod \"community-operators-5hprb\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.814447 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-catalog-content\") pod \"community-operators-5hprb\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.916094 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cnhl\" (UniqueName: \"kubernetes.io/projected/99d0ac0f-4be9-485a-8869-21d69d8f86b4-kube-api-access-8cnhl\") pod \"community-operators-5hprb\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.916189 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-utilities\") pod \"community-operators-5hprb\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.916217 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-catalog-content\") pod \"community-operators-5hprb\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.916789 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-catalog-content\") pod \"community-operators-5hprb\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.917221 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-utilities\") pod \"community-operators-5hprb\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:11 crc kubenswrapper[4922]: I0929 09:58:11.946592 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cnhl\" (UniqueName: \"kubernetes.io/projected/99d0ac0f-4be9-485a-8869-21d69d8f86b4-kube-api-access-8cnhl\") pod \"community-operators-5hprb\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:12 crc kubenswrapper[4922]: I0929 09:58:12.003630 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:12 crc kubenswrapper[4922]: I0929 09:58:12.616967 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5hprb"] Sep 29 09:58:12 crc kubenswrapper[4922]: W0929 09:58:12.623431 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99d0ac0f_4be9_485a_8869_21d69d8f86b4.slice/crio-86eeb67caa11f41e562042489c82e34e9871fccb4d46aee6244dd6c386ec6ba5 WatchSource:0}: Error finding container 86eeb67caa11f41e562042489c82e34e9871fccb4d46aee6244dd6c386ec6ba5: Status 404 returned error can't find the container with id 86eeb67caa11f41e562042489c82e34e9871fccb4d46aee6244dd6c386ec6ba5 Sep 29 09:58:13 crc kubenswrapper[4922]: I0929 09:58:13.614720 4922 generic.go:334] "Generic (PLEG): container finished" podID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerID="8a132fc939899ae225f0897bd38ffb391130544853682cbec9229583c83bb40a" exitCode=0 Sep 29 09:58:13 crc kubenswrapper[4922]: I0929 09:58:13.614780 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hprb" event={"ID":"99d0ac0f-4be9-485a-8869-21d69d8f86b4","Type":"ContainerDied","Data":"8a132fc939899ae225f0897bd38ffb391130544853682cbec9229583c83bb40a"} Sep 29 09:58:13 crc kubenswrapper[4922]: I0929 09:58:13.615133 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hprb" event={"ID":"99d0ac0f-4be9-485a-8869-21d69d8f86b4","Type":"ContainerStarted","Data":"86eeb67caa11f41e562042489c82e34e9871fccb4d46aee6244dd6c386ec6ba5"} Sep 29 09:58:14 crc kubenswrapper[4922]: I0929 09:58:14.045887 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:14 crc kubenswrapper[4922]: I0929 09:58:14.045957 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:14 crc kubenswrapper[4922]: I0929 09:58:14.120001 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:14 crc kubenswrapper[4922]: I0929 09:58:14.625938 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hprb" event={"ID":"99d0ac0f-4be9-485a-8869-21d69d8f86b4","Type":"ContainerStarted","Data":"8c3149dc3b2ffe2db16e3c4b4b6aefb2908de1f36d4b4edcde1600ecb9096c66"} Sep 29 09:58:14 crc kubenswrapper[4922]: I0929 09:58:14.675149 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.029388 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.033065 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.041188 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c4tqk" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.053612 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.062889 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.074240 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.080893 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.084822 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-hgrt7" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.089901 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.091538 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.094275 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mmhp6" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.098701 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.100111 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.103899 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vl8b6" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.141101 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.152225 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.164299 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.165393 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.168901 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.175248 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-kj5m2" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.175467 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.176443 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.177342 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blng8\" (UniqueName: \"kubernetes.io/projected/05dca5c7-0856-4c86-9bf8-99c6edc07252-kube-api-access-blng8\") pod \"cinder-operator-controller-manager-748c574d75-8vhzj\" (UID: \"05dca5c7-0856-4c86-9bf8-99c6edc07252\") " pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.177380 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57ssg\" (UniqueName: \"kubernetes.io/projected/733a9696-fd92-42d4-b4df-6e4ba3d9d433-kube-api-access-57ssg\") pod \"barbican-operator-controller-manager-6495d75b5-wxkn7\" (UID: \"733a9696-fd92-42d4-b4df-6e4ba3d9d433\") " pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.179597 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-r72b8" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.187160 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.192449 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.193443 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.195352 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.195632 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-znm2v" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.204676 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.210632 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.212437 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.216086 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vqxht" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.241469 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.263058 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.264922 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.271778 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6xg5p" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.280949 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blng8\" (UniqueName: \"kubernetes.io/projected/05dca5c7-0856-4c86-9bf8-99c6edc07252-kube-api-access-blng8\") pod \"cinder-operator-controller-manager-748c574d75-8vhzj\" (UID: \"05dca5c7-0856-4c86-9bf8-99c6edc07252\") " pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.280999 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t84t\" (UniqueName: \"kubernetes.io/projected/aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0-kube-api-access-7t84t\") pod \"horizon-operator-controller-manager-695847bc78-ft79c\" (UID: \"aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0\") " pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.281021 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57ssg\" (UniqueName: \"kubernetes.io/projected/733a9696-fd92-42d4-b4df-6e4ba3d9d433-kube-api-access-57ssg\") pod \"barbican-operator-controller-manager-6495d75b5-wxkn7\" (UID: \"733a9696-fd92-42d4-b4df-6e4ba3d9d433\") " pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.281061 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpnh\" (UniqueName: 
\"kubernetes.io/projected/2536f9c0-aac9-4d2c-be19-8afe9ac2e418-kube-api-access-rlpnh\") pod \"designate-operator-controller-manager-7d74f4d695-45kw4\" (UID: \"2536f9c0-aac9-4d2c-be19-8afe9ac2e418\") " pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.281086 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpmps\" (UniqueName: \"kubernetes.io/projected/0ed6eee8-7938-4f36-98f8-99af2cc40a4e-kube-api-access-tpmps\") pod \"glance-operator-controller-manager-67b5d44b7f-ph6mk\" (UID: \"0ed6eee8-7938-4f36-98f8-99af2cc40a4e\") " pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.281105 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hksgd\" (UniqueName: \"kubernetes.io/projected/88c2443d-e9bf-441b-ae76-93b7f63c790b-kube-api-access-hksgd\") pod \"heat-operator-controller-manager-8ff95898-jhr5q\" (UID: \"88c2443d-e9bf-441b-ae76-93b7f63c790b\") " pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.284678 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.285935 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.292938 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-ng8bf" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.296881 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.298006 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.305295 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hpj84" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.336058 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57ssg\" (UniqueName: \"kubernetes.io/projected/733a9696-fd92-42d4-b4df-6e4ba3d9d433-kube-api-access-57ssg\") pod \"barbican-operator-controller-manager-6495d75b5-wxkn7\" (UID: \"733a9696-fd92-42d4-b4df-6e4ba3d9d433\") " pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.336049 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blng8\" (UniqueName: \"kubernetes.io/projected/05dca5c7-0856-4c86-9bf8-99c6edc07252-kube-api-access-blng8\") pod \"cinder-operator-controller-manager-748c574d75-8vhzj\" (UID: \"05dca5c7-0856-4c86-9bf8-99c6edc07252\") " pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.339555 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.356055 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.357716 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.362347 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.371951 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-bhpbm" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.383977 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpnh\" (UniqueName: \"kubernetes.io/projected/2536f9c0-aac9-4d2c-be19-8afe9ac2e418-kube-api-access-rlpnh\") pod \"designate-operator-controller-manager-7d74f4d695-45kw4\" (UID: \"2536f9c0-aac9-4d2c-be19-8afe9ac2e418\") " pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.384024 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/698d9305-27b8-44fe-bcd9-f034bdfa9b09-cert\") pod \"infra-operator-controller-manager-858cd69f49-8pnnw\" (UID: \"698d9305-27b8-44fe-bcd9-f034bdfa9b09\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.384047 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpmps\" 
(UniqueName: \"kubernetes.io/projected/0ed6eee8-7938-4f36-98f8-99af2cc40a4e-kube-api-access-tpmps\") pod \"glance-operator-controller-manager-67b5d44b7f-ph6mk\" (UID: \"0ed6eee8-7938-4f36-98f8-99af2cc40a4e\") " pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.384073 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hksgd\" (UniqueName: \"kubernetes.io/projected/88c2443d-e9bf-441b-ae76-93b7f63c790b-kube-api-access-hksgd\") pod \"heat-operator-controller-manager-8ff95898-jhr5q\" (UID: \"88c2443d-e9bf-441b-ae76-93b7f63c790b\") " pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.384119 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgkc6\" (UniqueName: \"kubernetes.io/projected/39c6dedb-23e2-4515-83c8-1e85e0136cc8-kube-api-access-rgkc6\") pod \"keystone-operator-controller-manager-7bf498966c-xns9g\" (UID: \"39c6dedb-23e2-4515-83c8-1e85e0136cc8\") " pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.384140 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fbbt\" (UniqueName: \"kubernetes.io/projected/698d9305-27b8-44fe-bcd9-f034bdfa9b09-kube-api-access-6fbbt\") pod \"infra-operator-controller-manager-858cd69f49-8pnnw\" (UID: \"698d9305-27b8-44fe-bcd9-f034bdfa9b09\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.384171 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t84t\" (UniqueName: \"kubernetes.io/projected/aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0-kube-api-access-7t84t\") pod 
\"horizon-operator-controller-manager-695847bc78-ft79c\" (UID: \"aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0\") " pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.384192 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5f5\" (UniqueName: \"kubernetes.io/projected/fe4d01cb-1457-4cca-b2ed-7da6250a47df-kube-api-access-mc5f5\") pod \"ironic-operator-controller-manager-9fc8d5567-jbs5x\" (UID: \"fe4d01cb-1457-4cca-b2ed-7da6250a47df\") " pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.388951 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.399422 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.410343 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.419647 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.436302 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.437566 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.437682 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.442396 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jhtd5" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.450978 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.451816 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t84t\" (UniqueName: \"kubernetes.io/projected/aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0-kube-api-access-7t84t\") pod \"horizon-operator-controller-manager-695847bc78-ft79c\" (UID: \"aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0\") " pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.465555 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.490753 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlpnh\" (UniqueName: \"kubernetes.io/projected/2536f9c0-aac9-4d2c-be19-8afe9ac2e418-kube-api-access-rlpnh\") pod \"designate-operator-controller-manager-7d74f4d695-45kw4\" (UID: \"2536f9c0-aac9-4d2c-be19-8afe9ac2e418\") " pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.496265 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpmps\" (UniqueName: \"kubernetes.io/projected/0ed6eee8-7938-4f36-98f8-99af2cc40a4e-kube-api-access-tpmps\") pod \"glance-operator-controller-manager-67b5d44b7f-ph6mk\" (UID: \"0ed6eee8-7938-4f36-98f8-99af2cc40a4e\") " pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.497400 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hksgd\" (UniqueName: \"kubernetes.io/projected/88c2443d-e9bf-441b-ae76-93b7f63c790b-kube-api-access-hksgd\") pod \"heat-operator-controller-manager-8ff95898-jhr5q\" (UID: \"88c2443d-e9bf-441b-ae76-93b7f63c790b\") " pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.501281 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.501317 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tkckh" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.502725 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/698d9305-27b8-44fe-bcd9-f034bdfa9b09-cert\") pod \"infra-operator-controller-manager-858cd69f49-8pnnw\" (UID: \"698d9305-27b8-44fe-bcd9-f034bdfa9b09\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.502881 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhrmh\" (UniqueName: \"kubernetes.io/projected/a01ec1f8-817f-4ed8-9431-01847d4956be-kube-api-access-fhrmh\") pod \"nova-operator-controller-manager-c7c776c96-knzwz\" (UID: \"a01ec1f8-817f-4ed8-9431-01847d4956be\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.502978 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxb55\" (UniqueName: \"kubernetes.io/projected/8ad2a8b0-1e70-47e2-80a1-139eedb15541-kube-api-access-mxb55\") pod \"neutron-operator-controller-manager-54d766c9f9-r4lcr\" (UID: \"8ad2a8b0-1e70-47e2-80a1-139eedb15541\") " pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.503038 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgkc6\" (UniqueName: \"kubernetes.io/projected/39c6dedb-23e2-4515-83c8-1e85e0136cc8-kube-api-access-rgkc6\") pod 
\"keystone-operator-controller-manager-7bf498966c-xns9g\" (UID: \"39c6dedb-23e2-4515-83c8-1e85e0136cc8\") " pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.503093 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fbbt\" (UniqueName: \"kubernetes.io/projected/698d9305-27b8-44fe-bcd9-f034bdfa9b09-kube-api-access-6fbbt\") pod \"infra-operator-controller-manager-858cd69f49-8pnnw\" (UID: \"698d9305-27b8-44fe-bcd9-f034bdfa9b09\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.503160 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5f5\" (UniqueName: \"kubernetes.io/projected/fe4d01cb-1457-4cca-b2ed-7da6250a47df-kube-api-access-mc5f5\") pod \"ironic-operator-controller-manager-9fc8d5567-jbs5x\" (UID: \"fe4d01cb-1457-4cca-b2ed-7da6250a47df\") " pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.503193 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btdjk\" (UniqueName: \"kubernetes.io/projected/c9221095-3450-45f9-9aa2-e4994c8471ef-kube-api-access-btdjk\") pod \"manila-operator-controller-manager-56cf9c6b99-r8wxg\" (UID: \"c9221095-3450-45f9-9aa2-e4994c8471ef\") " pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.503219 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcbn7\" (UniqueName: \"kubernetes.io/projected/9c51299d-7ce3-4dff-b555-8cc2bcee6e4c-kube-api-access-mcbn7\") pod \"mariadb-operator-controller-manager-687b9cf756-b69kd\" (UID: \"9c51299d-7ce3-4dff-b555-8cc2bcee6e4c\") " 
pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" Sep 29 09:58:15 crc kubenswrapper[4922]: E0929 09:58:15.510402 4922 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 29 09:58:15 crc kubenswrapper[4922]: E0929 09:58:15.510573 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/698d9305-27b8-44fe-bcd9-f034bdfa9b09-cert podName:698d9305-27b8-44fe-bcd9-f034bdfa9b09 nodeName:}" failed. No retries permitted until 2025-09-29 09:58:16.010529761 +0000 UTC m=+821.376760025 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/698d9305-27b8-44fe-bcd9-f034bdfa9b09-cert") pod "infra-operator-controller-manager-858cd69f49-8pnnw" (UID: "698d9305-27b8-44fe-bcd9-f034bdfa9b09") : secret "infra-operator-webhook-server-cert" not found Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.541408 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5f5\" (UniqueName: \"kubernetes.io/projected/fe4d01cb-1457-4cca-b2ed-7da6250a47df-kube-api-access-mc5f5\") pod \"ironic-operator-controller-manager-9fc8d5567-jbs5x\" (UID: \"fe4d01cb-1457-4cca-b2ed-7da6250a47df\") " pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.546096 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.546563 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fbbt\" (UniqueName: \"kubernetes.io/projected/698d9305-27b8-44fe-bcd9-f034bdfa9b09-kube-api-access-6fbbt\") pod \"infra-operator-controller-manager-858cd69f49-8pnnw\" (UID: \"698d9305-27b8-44fe-bcd9-f034bdfa9b09\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.548418 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.568268 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgkc6\" (UniqueName: \"kubernetes.io/projected/39c6dedb-23e2-4515-83c8-1e85e0136cc8-kube-api-access-rgkc6\") pod \"keystone-operator-controller-manager-7bf498966c-xns9g\" (UID: \"39c6dedb-23e2-4515-83c8-1e85e0136cc8\") " pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.581677 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.581740 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-774b97b48-pnns5"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.587212 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.587484 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.590386 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.594803 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.606363 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-m6w7r" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.619731 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdjk\" (UniqueName: \"kubernetes.io/projected/c9221095-3450-45f9-9aa2-e4994c8471ef-kube-api-access-btdjk\") pod \"manila-operator-controller-manager-56cf9c6b99-r8wxg\" (UID: \"c9221095-3450-45f9-9aa2-e4994c8471ef\") " pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.620320 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcbn7\" (UniqueName: \"kubernetes.io/projected/9c51299d-7ce3-4dff-b555-8cc2bcee6e4c-kube-api-access-mcbn7\") pod \"mariadb-operator-controller-manager-687b9cf756-b69kd\" (UID: \"9c51299d-7ce3-4dff-b555-8cc2bcee6e4c\") " pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.620363 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r5dr\" (UniqueName: \"kubernetes.io/projected/60630351-afcc-4792-bb16-5994368117cd-kube-api-access-8r5dr\") pod \"octavia-operator-controller-manager-76fcc6dc7c-5wnjj\" (UID: \"60630351-afcc-4792-bb16-5994368117cd\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.620507 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhrmh\" (UniqueName: \"kubernetes.io/projected/a01ec1f8-817f-4ed8-9431-01847d4956be-kube-api-access-fhrmh\") pod \"nova-operator-controller-manager-c7c776c96-knzwz\" (UID: \"a01ec1f8-817f-4ed8-9431-01847d4956be\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.618515 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-54ldg" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.620563 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxb55\" (UniqueName: \"kubernetes.io/projected/8ad2a8b0-1e70-47e2-80a1-139eedb15541-kube-api-access-mxb55\") pod \"neutron-operator-controller-manager-54d766c9f9-r4lcr\" (UID: \"8ad2a8b0-1e70-47e2-80a1-139eedb15541\") " pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.667172 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.669248 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdjk\" (UniqueName: \"kubernetes.io/projected/c9221095-3450-45f9-9aa2-e4994c8471ef-kube-api-access-btdjk\") pod 
\"manila-operator-controller-manager-56cf9c6b99-r8wxg\" (UID: \"c9221095-3450-45f9-9aa2-e4994c8471ef\") " pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.669348 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.669478 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxb55\" (UniqueName: \"kubernetes.io/projected/8ad2a8b0-1e70-47e2-80a1-139eedb15541-kube-api-access-mxb55\") pod \"neutron-operator-controller-manager-54d766c9f9-r4lcr\" (UID: \"8ad2a8b0-1e70-47e2-80a1-139eedb15541\") " pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.672724 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcbn7\" (UniqueName: \"kubernetes.io/projected/9c51299d-7ce3-4dff-b555-8cc2bcee6e4c-kube-api-access-mcbn7\") pod \"mariadb-operator-controller-manager-687b9cf756-b69kd\" (UID: \"9c51299d-7ce3-4dff-b555-8cc2bcee6e4c\") " pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.673265 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-nkgnq" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.679512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhrmh\" (UniqueName: \"kubernetes.io/projected/a01ec1f8-817f-4ed8-9431-01847d4956be-kube-api-access-fhrmh\") pod \"nova-operator-controller-manager-c7c776c96-knzwz\" (UID: \"a01ec1f8-817f-4ed8-9431-01847d4956be\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" Sep 29 09:58:15 crc kubenswrapper[4922]: 
I0929 09:58:15.680009 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.700962 4922 generic.go:334] "Generic (PLEG): container finished" podID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerID="8c3149dc3b2ffe2db16e3c4b4b6aefb2908de1f36d4b4edcde1600ecb9096c66" exitCode=0 Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.703621 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hprb" event={"ID":"99d0ac0f-4be9-485a-8869-21d69d8f86b4","Type":"ContainerDied","Data":"8c3149dc3b2ffe2db16e3c4b4b6aefb2908de1f36d4b4edcde1600ecb9096c66"} Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.717030 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.718363 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.724819 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-c2bbq" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.725260 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.728907 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r5dr\" (UniqueName: \"kubernetes.io/projected/60630351-afcc-4792-bb16-5994368117cd-kube-api-access-8r5dr\") pod \"octavia-operator-controller-manager-76fcc6dc7c-5wnjj\" (UID: \"60630351-afcc-4792-bb16-5994368117cd\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.729163 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxqhm\" (UniqueName: \"kubernetes.io/projected/c38d04c4-b717-4155-b646-b06c3dac3386-kube-api-access-qxqhm\") pod \"placement-operator-controller-manager-774b97b48-pnns5\" (UID: \"c38d04c4-b717-4155-b646-b06c3dac3386\") " pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.729227 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjphh\" (UniqueName: \"kubernetes.io/projected/c4ba5f8a-ca61-4870-bc8e-017e79e139a5-kube-api-access-rjphh\") pod \"ovn-operator-controller-manager-5f95c46c78-rvj4d\" (UID: \"c4ba5f8a-ca61-4870-bc8e-017e79e139a5\") " pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 
09:58:15.736079 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.752105 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.786489 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r5dr\" (UniqueName: \"kubernetes.io/projected/60630351-afcc-4792-bb16-5994368117cd-kube-api-access-8r5dr\") pod \"octavia-operator-controller-manager-76fcc6dc7c-5wnjj\" (UID: \"60630351-afcc-4792-bb16-5994368117cd\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.801338 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-774b97b48-pnns5"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.801787 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.810060 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.833602 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.835194 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxqhm\" (UniqueName: \"kubernetes.io/projected/c38d04c4-b717-4155-b646-b06c3dac3386-kube-api-access-qxqhm\") pod \"placement-operator-controller-manager-774b97b48-pnns5\" (UID: \"c38d04c4-b717-4155-b646-b06c3dac3386\") " pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.835284 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2twqt\" (UniqueName: \"kubernetes.io/projected/71aa3678-e2c4-4a23-9e66-738fddb6066f-kube-api-access-2twqt\") pod \"swift-operator-controller-manager-bc7dc7bd9-82g7r\" (UID: \"71aa3678-e2c4-4a23-9e66-738fddb6066f\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.835309 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjphh\" (UniqueName: \"kubernetes.io/projected/c4ba5f8a-ca61-4870-bc8e-017e79e139a5-kube-api-access-rjphh\") pod \"ovn-operator-controller-manager-5f95c46c78-rvj4d\" (UID: \"c4ba5f8a-ca61-4870-bc8e-017e79e139a5\") " pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.835385 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bbc35ab-8adc-445e-bc17-690ce9533a3e-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-p2hsw\" (UID: \"2bbc35ab-8adc-445e-bc17-690ce9533a3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.835445 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjp5t\" (UniqueName: \"kubernetes.io/projected/2bbc35ab-8adc-445e-bc17-690ce9533a3e-kube-api-access-bjp5t\") pod \"openstack-baremetal-operator-controller-manager-6d776955-p2hsw\" (UID: \"2bbc35ab-8adc-445e-bc17-690ce9533a3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.842371 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.860081 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjphh\" (UniqueName: \"kubernetes.io/projected/c4ba5f8a-ca61-4870-bc8e-017e79e139a5-kube-api-access-rjphh\") pod \"ovn-operator-controller-manager-5f95c46c78-rvj4d\" (UID: \"c4ba5f8a-ca61-4870-bc8e-017e79e139a5\") " pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.866120 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.867371 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxqhm\" (UniqueName: \"kubernetes.io/projected/c38d04c4-b717-4155-b646-b06c3dac3386-kube-api-access-qxqhm\") pod \"placement-operator-controller-manager-774b97b48-pnns5\" (UID: \"c38d04c4-b717-4155-b646-b06c3dac3386\") " pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.870768 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.876496 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.876675 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.880408 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9gm7k" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.880667 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.885269 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.887811 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.889837 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4m9qc" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.901990 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.914578 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.917711 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.919897 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.924118 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-jjvk6" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.924285 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t"] Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.938293 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2twqt\" (UniqueName: \"kubernetes.io/projected/71aa3678-e2c4-4a23-9e66-738fddb6066f-kube-api-access-2twqt\") pod \"swift-operator-controller-manager-bc7dc7bd9-82g7r\" (UID: \"71aa3678-e2c4-4a23-9e66-738fddb6066f\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.938366 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bbc35ab-8adc-445e-bc17-690ce9533a3e-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-p2hsw\" (UID: \"2bbc35ab-8adc-445e-bc17-690ce9533a3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.938415 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjp5t\" (UniqueName: \"kubernetes.io/projected/2bbc35ab-8adc-445e-bc17-690ce9533a3e-kube-api-access-bjp5t\") pod \"openstack-baremetal-operator-controller-manager-6d776955-p2hsw\" (UID: \"2bbc35ab-8adc-445e-bc17-690ce9533a3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:15 crc kubenswrapper[4922]: E0929 09:58:15.939028 4922 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 09:58:15 crc kubenswrapper[4922]: E0929 09:58:15.939134 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbc35ab-8adc-445e-bc17-690ce9533a3e-cert podName:2bbc35ab-8adc-445e-bc17-690ce9533a3e nodeName:}" failed. No retries permitted until 2025-09-29 09:58:16.439100405 +0000 UTC m=+821.805330669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2bbc35ab-8adc-445e-bc17-690ce9533a3e-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-p2hsw" (UID: "2bbc35ab-8adc-445e-bc17-690ce9533a3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.970397 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" Sep 29 09:58:15 crc kubenswrapper[4922]: I0929 09:58:15.997557 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.005048 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjp5t\" (UniqueName: \"kubernetes.io/projected/2bbc35ab-8adc-445e-bc17-690ce9533a3e-kube-api-access-bjp5t\") pod \"openstack-baremetal-operator-controller-manager-6d776955-p2hsw\" (UID: \"2bbc35ab-8adc-445e-bc17-690ce9533a3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.006822 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2twqt\" (UniqueName: \"kubernetes.io/projected/71aa3678-e2c4-4a23-9e66-738fddb6066f-kube-api-access-2twqt\") pod \"swift-operator-controller-manager-bc7dc7bd9-82g7r\" (UID: \"71aa3678-e2c4-4a23-9e66-738fddb6066f\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.043572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsdk5\" (UniqueName: \"kubernetes.io/projected/0d5d466f-e41b-42c4-91ff-11e84d297b5d-kube-api-access-lsdk5\") pod \"watcher-operator-controller-manager-76669f99c-lcd9t\" (UID: \"0d5d466f-e41b-42c4-91ff-11e84d297b5d\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.043635 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qftk7\" (UniqueName: \"kubernetes.io/projected/f4fbefa3-c5d4-4a51-b90a-512ebfcef863-kube-api-access-qftk7\") pod \"telemetry-operator-controller-manager-5bf96cfbc4-5s5wf\" (UID: \"f4fbefa3-c5d4-4a51-b90a-512ebfcef863\") " pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" Sep 29 09:58:16 
crc kubenswrapper[4922]: I0929 09:58:16.043684 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/698d9305-27b8-44fe-bcd9-f034bdfa9b09-cert\") pod \"infra-operator-controller-manager-858cd69f49-8pnnw\" (UID: \"698d9305-27b8-44fe-bcd9-f034bdfa9b09\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.043750 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5d5r\" (UniqueName: \"kubernetes.io/projected/97df5e99-5243-4552-ab72-7c6526deea11-kube-api-access-h5d5r\") pod \"test-operator-controller-manager-f66b554c6-lfh4h\" (UID: \"97df5e99-5243-4552-ab72-7c6526deea11\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.053665 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29"] Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.055124 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.061466 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/698d9305-27b8-44fe-bcd9-f034bdfa9b09-cert\") pod \"infra-operator-controller-manager-858cd69f49-8pnnw\" (UID: \"698d9305-27b8-44fe-bcd9-f034bdfa9b09\") " pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.068194 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.068342 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-trcfg" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.068595 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29"] Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.142288 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.145424 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5d5r\" (UniqueName: \"kubernetes.io/projected/97df5e99-5243-4552-ab72-7c6526deea11-kube-api-access-h5d5r\") pod \"test-operator-controller-manager-f66b554c6-lfh4h\" (UID: \"97df5e99-5243-4552-ab72-7c6526deea11\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.145494 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsdk5\" (UniqueName: \"kubernetes.io/projected/0d5d466f-e41b-42c4-91ff-11e84d297b5d-kube-api-access-lsdk5\") pod \"watcher-operator-controller-manager-76669f99c-lcd9t\" (UID: \"0d5d466f-e41b-42c4-91ff-11e84d297b5d\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.145525 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qftk7\" (UniqueName: \"kubernetes.io/projected/f4fbefa3-c5d4-4a51-b90a-512ebfcef863-kube-api-access-qftk7\") pod \"telemetry-operator-controller-manager-5bf96cfbc4-5s5wf\" (UID: \"f4fbefa3-c5d4-4a51-b90a-512ebfcef863\") " pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.145560 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5twk\" (UniqueName: \"kubernetes.io/projected/00b67606-b7e3-4043-abff-cae6f14ba095-kube-api-access-m5twk\") pod \"openstack-operator-controller-manager-8d8dc476c-zjv29\" (UID: \"00b67606-b7e3-4043-abff-cae6f14ba095\") " pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:16 crc 
kubenswrapper[4922]: I0929 09:58:16.145615 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b67606-b7e3-4043-abff-cae6f14ba095-cert\") pod \"openstack-operator-controller-manager-8d8dc476c-zjv29\" (UID: \"00b67606-b7e3-4043-abff-cae6f14ba095\") " pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.158382 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.221430 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8"] Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.229036 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.234338 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-fxbm4" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.246895 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsdk5\" (UniqueName: \"kubernetes.io/projected/0d5d466f-e41b-42c4-91ff-11e84d297b5d-kube-api-access-lsdk5\") pod \"watcher-operator-controller-manager-76669f99c-lcd9t\" (UID: \"0d5d466f-e41b-42c4-91ff-11e84d297b5d\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.248369 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5twk\" (UniqueName: \"kubernetes.io/projected/00b67606-b7e3-4043-abff-cae6f14ba095-kube-api-access-m5twk\") pod 
\"openstack-operator-controller-manager-8d8dc476c-zjv29\" (UID: \"00b67606-b7e3-4043-abff-cae6f14ba095\") " pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.248444 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b67606-b7e3-4043-abff-cae6f14ba095-cert\") pod \"openstack-operator-controller-manager-8d8dc476c-zjv29\" (UID: \"00b67606-b7e3-4043-abff-cae6f14ba095\") " pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:16 crc kubenswrapper[4922]: E0929 09:58:16.248665 4922 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 29 09:58:16 crc kubenswrapper[4922]: E0929 09:58:16.248722 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b67606-b7e3-4043-abff-cae6f14ba095-cert podName:00b67606-b7e3-4043-abff-cae6f14ba095 nodeName:}" failed. No retries permitted until 2025-09-29 09:58:16.748704322 +0000 UTC m=+822.114934586 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00b67606-b7e3-4043-abff-cae6f14ba095-cert") pod "openstack-operator-controller-manager-8d8dc476c-zjv29" (UID: "00b67606-b7e3-4043-abff-cae6f14ba095") : secret "webhook-server-cert" not found Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.251894 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8"] Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.259480 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qftk7\" (UniqueName: \"kubernetes.io/projected/f4fbefa3-c5d4-4a51-b90a-512ebfcef863-kube-api-access-qftk7\") pod \"telemetry-operator-controller-manager-5bf96cfbc4-5s5wf\" (UID: \"f4fbefa3-c5d4-4a51-b90a-512ebfcef863\") " pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.260508 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.281898 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5d5r\" (UniqueName: \"kubernetes.io/projected/97df5e99-5243-4552-ab72-7c6526deea11-kube-api-access-h5d5r\") pod \"test-operator-controller-manager-f66b554c6-lfh4h\" (UID: \"97df5e99-5243-4552-ab72-7c6526deea11\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.286346 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5twk\" (UniqueName: \"kubernetes.io/projected/00b67606-b7e3-4043-abff-cae6f14ba095-kube-api-access-m5twk\") pod \"openstack-operator-controller-manager-8d8dc476c-zjv29\" (UID: \"00b67606-b7e3-4043-abff-cae6f14ba095\") " pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.319647 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7"] Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.353713 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j66lq\" (UniqueName: \"kubernetes.io/projected/c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3-kube-api-access-j66lq\") pod \"rabbitmq-cluster-operator-manager-79d8469568-lspp8\" (UID: \"c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.412364 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj"] Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.457000 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bbc35ab-8adc-445e-bc17-690ce9533a3e-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-p2hsw\" (UID: \"2bbc35ab-8adc-445e-bc17-690ce9533a3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.457330 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j66lq\" (UniqueName: \"kubernetes.io/projected/c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3-kube-api-access-j66lq\") pod \"rabbitmq-cluster-operator-manager-79d8469568-lspp8\" (UID: \"c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.462579 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bbc35ab-8adc-445e-bc17-690ce9533a3e-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-p2hsw\" (UID: \"2bbc35ab-8adc-445e-bc17-690ce9533a3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.463860 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlplc"] Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.475520 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j66lq\" (UniqueName: \"kubernetes.io/projected/c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3-kube-api-access-j66lq\") pod \"rabbitmq-cluster-operator-manager-79d8469568-lspp8\" (UID: \"c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.477137 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.509577 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.526465 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.621644 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.640835 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q"] Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.727616 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" event={"ID":"733a9696-fd92-42d4-b4df-6e4ba3d9d433","Type":"ContainerStarted","Data":"b9bd993b590cb5cfcc1a9711cbc457f3785231e01160d58212f9fa1e9f44f0b8"} Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.734991 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" event={"ID":"05dca5c7-0856-4c86-9bf8-99c6edc07252","Type":"ContainerStarted","Data":"a18d6206a1a0344bf59b2e568383045dfbcdb5bdc9811ac5c01588885b6d2a5c"} Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.736526 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mlplc" podUID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerName="registry-server" 
containerID="cri-o://c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956" gracePeriod=2 Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.740494 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" event={"ID":"88c2443d-e9bf-441b-ae76-93b7f63c790b","Type":"ContainerStarted","Data":"333e949a5e68aec84fbe0c265e270a67f09b1a041a8e4c52ca9854474298ddd3"} Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.763379 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b67606-b7e3-4043-abff-cae6f14ba095-cert\") pod \"openstack-operator-controller-manager-8d8dc476c-zjv29\" (UID: \"00b67606-b7e3-4043-abff-cae6f14ba095\") " pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.770909 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b67606-b7e3-4043-abff-cae6f14ba095-cert\") pod \"openstack-operator-controller-manager-8d8dc476c-zjv29\" (UID: \"00b67606-b7e3-4043-abff-cae6f14ba095\") " pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:16 crc kubenswrapper[4922]: I0929 09:58:16.886136 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.021069 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.031003 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.055881 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.325666 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.338585 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.347666 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.377881 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.399235 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.404982 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk"] Sep 29 09:58:17 crc kubenswrapper[4922]: W0929 09:58:17.410796 4922 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c51299d_7ce3_4dff_b555_8cc2bcee6e4c.slice/crio-4ef92421e35d16a54a177bc9074104b6e4084b153d8ba553f1ee4987aa4d3472 WatchSource:0}: Error finding container 4ef92421e35d16a54a177bc9074104b6e4084b153d8ba553f1ee4987aa4d3472: Status 404 returned error can't find the container with id 4ef92421e35d16a54a177bc9074104b6e4084b153d8ba553f1ee4987aa4d3472 Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.482107 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.574066 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-catalog-content\") pod \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.574128 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljn22\" (UniqueName: \"kubernetes.io/projected/0b90a978-860b-46f3-a6c8-c7da96bcab3c-kube-api-access-ljn22\") pod \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.574155 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-utilities\") pod \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\" (UID: \"0b90a978-860b-46f3-a6c8-c7da96bcab3c\") " Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.575501 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-utilities" (OuterVolumeSpecName: "utilities") pod "0b90a978-860b-46f3-a6c8-c7da96bcab3c" (UID: 
"0b90a978-860b-46f3-a6c8-c7da96bcab3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.589610 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b90a978-860b-46f3-a6c8-c7da96bcab3c-kube-api-access-ljn22" (OuterVolumeSpecName: "kube-api-access-ljn22") pod "0b90a978-860b-46f3-a6c8-c7da96bcab3c" (UID: "0b90a978-860b-46f3-a6c8-c7da96bcab3c"). InnerVolumeSpecName "kube-api-access-ljn22". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.597463 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b90a978-860b-46f3-a6c8-c7da96bcab3c" (UID: "0b90a978-860b-46f3-a6c8-c7da96bcab3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.638270 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d"] Sep 29 09:58:17 crc kubenswrapper[4922]: W0929 09:58:17.658588 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9221095_3450_45f9_9aa2_e4994c8471ef.slice/crio-4c3d7fe689b12863635015cd09e25a0ea5df9a581d833d3762d5f89f744ad00d WatchSource:0}: Error finding container 4c3d7fe689b12863635015cd09e25a0ea5df9a581d833d3762d5f89f744ad00d: Status 404 returned error can't find the container with id 4c3d7fe689b12863635015cd09e25a0ea5df9a581d833d3762d5f89f744ad00d Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.658658 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg"] Sep 29 09:58:17 crc 
kubenswrapper[4922]: I0929 09:58:17.667790 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.672946 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-774b97b48-pnns5"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.677538 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.677770 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.677790 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljn22\" (UniqueName: \"kubernetes.io/projected/0b90a978-860b-46f3-a6c8-c7da96bcab3c-kube-api-access-ljn22\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.677802 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b90a978-860b-46f3-a6c8-c7da96bcab3c-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.682869 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw"] Sep 29 09:58:17 crc kubenswrapper[4922]: W0929 09:58:17.695772 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4ba5f8a_ca61_4870_bc8e_017e79e139a5.slice/crio-0da0c0c6d77587b73d45eb249bf9f03c385cc903c950607830779f5213063e2b WatchSource:0}: Error finding container 0da0c0c6d77587b73d45eb249bf9f03c385cc903c950607830779f5213063e2b: Status 
404 returned error can't find the container with id 0da0c0c6d77587b73d45eb249bf9f03c385cc903c950607830779f5213063e2b Sep 29 09:58:17 crc kubenswrapper[4922]: W0929 09:58:17.700227 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod698d9305_27b8_44fe_bcd9_f034bdfa9b09.slice/crio-c0763f218114f903703b8b4fb32c6b20722bd00b2ad04dadbac84e673f203e67 WatchSource:0}: Error finding container c0763f218114f903703b8b4fb32c6b20722bd00b2ad04dadbac84e673f203e67: Status 404 returned error can't find the container with id c0763f218114f903703b8b4fb32c6b20722bd00b2ad04dadbac84e673f203e67 Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.703272 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw"] Sep 29 09:58:17 crc kubenswrapper[4922]: E0929 09:58:17.706385 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:87a522d480797f54499bcd1c4a48837e1b17c33d4cc43e99ed7a53b8cedb17c7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fbbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-858cd69f49-8pnnw_openstack-operators(698d9305-27b8-44fe-bcd9-f034bdfa9b09): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 09:58:17 crc kubenswrapper[4922]: E0929 09:58:17.736007 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lsdk5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-76669f99c-lcd9t_openstack-operators(0d5d466f-e41b-42c4-91ff-11e84d297b5d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 09:58:17 crc kubenswrapper[4922]: W0929 09:58:17.736540 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bbc35ab_8adc_445e_bc17_690ce9533a3e.slice/crio-1abba50e9543756528a13d0a89816ed93d3038497c6a1df4f3c3f581e7a71c1a WatchSource:0}: Error finding container 1abba50e9543756528a13d0a89816ed93d3038497c6a1df4f3c3f581e7a71c1a: Status 404 returned error can't find the container with id 1abba50e9543756528a13d0a89816ed93d3038497c6a1df4f3c3f581e7a71c1a Sep 29 09:58:17 crc kubenswrapper[4922]: E0929 09:58:17.757401 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECI
SION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjp5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-baremetal-operator-controller-manager-6d776955-p2hsw_openstack-operators(2bbc35ab-8adc-445e-bc17-690ce9533a3e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.759421 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" event={"ID":"2536f9c0-aac9-4d2c-be19-8afe9ac2e418","Type":"ContainerStarted","Data":"19dad4a2437d58b5b7c64dc9442d73bfd8998e43ab0a0d5a95d94ff5fc1f8029"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.762432 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" event={"ID":"60630351-afcc-4792-bb16-5994368117cd","Type":"ContainerStarted","Data":"56e76a4f6cc064bb9620e69ccf2cc8da1afa834f1f5ac52ea03accca4d90f266"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.768730 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" event={"ID":"aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0","Type":"ContainerStarted","Data":"946e453648bd6f27b53e3d2bdcff06477b55d71ee86bb202decada0f7b28dadd"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.770753 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" event={"ID":"c4ba5f8a-ca61-4870-bc8e-017e79e139a5","Type":"ContainerStarted","Data":"0da0c0c6d77587b73d45eb249bf9f03c385cc903c950607830779f5213063e2b"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.773460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" event={"ID":"c9221095-3450-45f9-9aa2-e4994c8471ef","Type":"ContainerStarted","Data":"4c3d7fe689b12863635015cd09e25a0ea5df9a581d833d3762d5f89f744ad00d"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.793149 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" event={"ID":"0ed6eee8-7938-4f36-98f8-99af2cc40a4e","Type":"ContainerStarted","Data":"5d0e9370c0d921b6d43346a6ee64944f934ea11d472f77e036d0f223f53ccdfa"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.794702 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" event={"ID":"698d9305-27b8-44fe-bcd9-f034bdfa9b09","Type":"ContainerStarted","Data":"c0763f218114f903703b8b4fb32c6b20722bd00b2ad04dadbac84e673f203e67"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.797535 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" event={"ID":"c38d04c4-b717-4155-b646-b06c3dac3386","Type":"ContainerStarted","Data":"5d0bf75d90238ee2b024379ef165f415b2872944a94f5db709b4f63ffee9bb02"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.799398 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" event={"ID":"0d5d466f-e41b-42c4-91ff-11e84d297b5d","Type":"ContainerStarted","Data":"8246c06f4ff91e941a2c645cae8ff34a01e7cab8a2925b9410d500998a463d9e"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.801015 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" event={"ID":"2bbc35ab-8adc-445e-bc17-690ce9533a3e","Type":"ContainerStarted","Data":"1abba50e9543756528a13d0a89816ed93d3038497c6a1df4f3c3f581e7a71c1a"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.804437 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" 
event={"ID":"9c51299d-7ce3-4dff-b555-8cc2bcee6e4c","Type":"ContainerStarted","Data":"4ef92421e35d16a54a177bc9074104b6e4084b153d8ba553f1ee4987aa4d3472"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.806076 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" event={"ID":"39c6dedb-23e2-4515-83c8-1e85e0136cc8","Type":"ContainerStarted","Data":"8f75f2c4656a1675602924bccc159d3e649ffafece497447aad07bf0af34060f"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.825884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" event={"ID":"71aa3678-e2c4-4a23-9e66-738fddb6066f","Type":"ContainerStarted","Data":"c0319d8a13c5e14cfc4031b7ef328c264dd094b77bf5b287208289d58038372f"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.831965 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" event={"ID":"a01ec1f8-817f-4ed8-9431-01847d4956be","Type":"ContainerStarted","Data":"9c4c6b4eff41d1fce83a8b56cc83474c7da94f5ec02a1faacf1b8070d90e0b69"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.833788 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" event={"ID":"8ad2a8b0-1e70-47e2-80a1-139eedb15541","Type":"ContainerStarted","Data":"9a964681d49fec80241de42a8ece6c775434fcf65879bc6d4a17dec454bf46a2"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.839181 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hprb" event={"ID":"99d0ac0f-4be9-485a-8869-21d69d8f86b4","Type":"ContainerStarted","Data":"425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.842119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" event={"ID":"fe4d01cb-1457-4cca-b2ed-7da6250a47df","Type":"ContainerStarted","Data":"1b0d2a79b972fe6de2e4bc4f2ef79defb2ced390b622c74f47969706ce19bc9c"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.847492 4922 generic.go:334] "Generic (PLEG): container finished" podID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerID="c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956" exitCode=0 Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.847538 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlplc" event={"ID":"0b90a978-860b-46f3-a6c8-c7da96bcab3c","Type":"ContainerDied","Data":"c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.847585 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlplc" event={"ID":"0b90a978-860b-46f3-a6c8-c7da96bcab3c","Type":"ContainerDied","Data":"afa5edcda38522ede53cccaf7805f312313b60765714c570c937f1f950f5ffee"} Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.847608 4922 scope.go:117] "RemoveContainer" containerID="c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.849752 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlplc" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.883375 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5hprb" podStartSLOduration=3.788981622 podStartE2EDuration="6.883350394s" podCreationTimestamp="2025-09-29 09:58:11 +0000 UTC" firstStartedPulling="2025-09-29 09:58:13.616989102 +0000 UTC m=+818.983219366" lastFinishedPulling="2025-09-29 09:58:16.711357874 +0000 UTC m=+822.077588138" observedRunningTime="2025-09-29 09:58:17.873139145 +0000 UTC m=+823.239369409" watchObservedRunningTime="2025-09-29 09:58:17.883350394 +0000 UTC m=+823.249580658" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.959227 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlplc"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.963175 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.968549 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.974347 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlplc"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.979931 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf"] Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.981871 4922 scope.go:117] "RemoveContainer" containerID="ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f" Sep 29 09:58:17 crc kubenswrapper[4922]: I0929 09:58:17.990095 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29"] Sep 29 09:58:18 crc kubenswrapper[4922]: W0929 09:58:18.001489 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97df5e99_5243_4552_ab72_7c6526deea11.slice/crio-edae180c237b038b6148fb2cf837a40914737f613ade6bdc43f8bcc60a2ad450 WatchSource:0}: Error finding container edae180c237b038b6148fb2cf837a40914737f613ade6bdc43f8bcc60a2ad450: Status 404 returned error can't find the container with id edae180c237b038b6148fb2cf837a40914737f613ade6bdc43f8bcc60a2ad450 Sep 29 09:58:18 crc kubenswrapper[4922]: W0929 09:58:18.001805 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4fbefa3_c5d4_4a51_b90a_512ebfcef863.slice/crio-bad703afa6eed5e54cd86d364244f1f3b71786d88f7ccda929acf774120df0d6 WatchSource:0}: Error finding container bad703afa6eed5e54cd86d364244f1f3b71786d88f7ccda929acf774120df0d6: Status 404 returned error can't find the container with id bad703afa6eed5e54cd86d364244f1f3b71786d88f7ccda929acf774120df0d6 Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.008215 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" podUID="698d9305-27b8-44fe-bcd9-f034bdfa9b09" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.018963 4922 scope.go:117] "RemoveContainer" containerID="6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618" Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.020531 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5d5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-f66b554c6-lfh4h_openstack-operators(97df5e99-5243-4552-ab72-7c6526deea11): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.089577 4922 scope.go:117] "RemoveContainer" containerID="c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956" Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.090694 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956\": container with ID starting with c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956 not found: ID does not exist" containerID="c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.090736 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956"} err="failed to get container status \"c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956\": rpc error: code = NotFound desc = could not 
find container \"c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956\": container with ID starting with c36c3c2328600e6d21210076a6cb33b626892460945ff1a8be9f6c7ee2c17956 not found: ID does not exist" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.090761 4922 scope.go:117] "RemoveContainer" containerID="ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f" Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.096690 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f\": container with ID starting with ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f not found: ID does not exist" containerID="ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.096720 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f"} err="failed to get container status \"ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f\": rpc error: code = NotFound desc = could not find container \"ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f\": container with ID starting with ea14c1d1d02d9e34fc50055a08679c71b7814c0344ffe0d0286b2f0aabfcea1f not found: ID does not exist" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.096738 4922 scope.go:117] "RemoveContainer" containerID="6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618" Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.097993 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" podUID="0d5d466f-e41b-42c4-91ff-11e84d297b5d" Sep 29 09:58:18 crc 
kubenswrapper[4922]: E0929 09:58:18.098093 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618\": container with ID starting with 6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618 not found: ID does not exist" containerID="6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.098115 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618"} err="failed to get container status \"6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618\": rpc error: code = NotFound desc = could not find container \"6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618\": container with ID starting with 6abc94956b6f254c6f6f0f1654986fd6c38b5f2753623b3c8ec917258e428618 not found: ID does not exist" Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.221406 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" podUID="2bbc35ab-8adc-445e-bc17-690ce9533a3e" Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.253587 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" podUID="97df5e99-5243-4552-ab72-7c6526deea11" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.912564 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" 
event={"ID":"00b67606-b7e3-4043-abff-cae6f14ba095","Type":"ContainerStarted","Data":"cc9d9d4b69df4eee770abe41ef33abc0fe36d20a91ab74396423b7a57e38803b"} Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.912646 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" event={"ID":"00b67606-b7e3-4043-abff-cae6f14ba095","Type":"ContainerStarted","Data":"b0498ea86aeefd6b7b04843d78ee11ae3a51421668a0f8a8c57bcb85cc57ec2a"} Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.912658 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" event={"ID":"00b67606-b7e3-4043-abff-cae6f14ba095","Type":"ContainerStarted","Data":"b04408dca78d59b75e5d14372614df7efe8200f23093f161554e6e2197821242"} Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.916216 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.920100 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" event={"ID":"2bbc35ab-8adc-445e-bc17-690ce9533a3e","Type":"ContainerStarted","Data":"8ab2361ff409a864faf548110c3327cb47bbafb308a72a77849f1b850cade4fe"} Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.926223 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" podUID="2bbc35ab-8adc-445e-bc17-690ce9533a3e" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.931757 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" event={"ID":"97df5e99-5243-4552-ab72-7c6526deea11","Type":"ContainerStarted","Data":"03a2abddc1be266dfdc6d15ed8cc90364debdd10a2f5b3becf99040eb978b5fa"} Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.931798 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" event={"ID":"97df5e99-5243-4552-ab72-7c6526deea11","Type":"ContainerStarted","Data":"edae180c237b038b6148fb2cf837a40914737f613ade6bdc43f8bcc60a2ad450"} Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.934527 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" podUID="97df5e99-5243-4552-ab72-7c6526deea11" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.947651 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" podStartSLOduration=3.94762457 podStartE2EDuration="3.94762457s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 09:58:18.943591581 +0000 UTC m=+824.309821845" watchObservedRunningTime="2025-09-29 09:58:18.94762457 +0000 UTC m=+824.313854834" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.960686 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" 
event={"ID":"f4fbefa3-c5d4-4a51-b90a-512ebfcef863","Type":"ContainerStarted","Data":"bad703afa6eed5e54cd86d364244f1f3b71786d88f7ccda929acf774120df0d6"} Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.963749 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" event={"ID":"698d9305-27b8-44fe-bcd9-f034bdfa9b09","Type":"ContainerStarted","Data":"6e6ba80429920dc2a51936adaad61997bccdb13fe9fa89e259eb51eb2cc77351"} Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.975524 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:87a522d480797f54499bcd1c4a48837e1b17c33d4cc43e99ed7a53b8cedb17c7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" podUID="698d9305-27b8-44fe-bcd9-f034bdfa9b09" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.975905 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" event={"ID":"0d5d466f-e41b-42c4-91ff-11e84d297b5d","Type":"ContainerStarted","Data":"275e0c8cce497ccf2420d2d4f5a03f4491497b5d8a059e0a6e234c0fc9ba13c4"} Sep 29 09:58:18 crc kubenswrapper[4922]: E0929 09:58:18.977568 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" podUID="0d5d466f-e41b-42c4-91ff-11e84d297b5d" Sep 29 09:58:18 crc kubenswrapper[4922]: I0929 09:58:18.978264 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" 
event={"ID":"c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3","Type":"ContainerStarted","Data":"77c584a744b8877428793aecfb622a434b9548782ebc3be95158b30fd56a0ac0"} Sep 29 09:58:19 crc kubenswrapper[4922]: I0929 09:58:19.480607 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" path="/var/lib/kubelet/pods/0b90a978-860b-46f3-a6c8-c7da96bcab3c/volumes" Sep 29 09:58:19 crc kubenswrapper[4922]: E0929 09:58:19.989766 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" podUID="2bbc35ab-8adc-445e-bc17-690ce9533a3e" Sep 29 09:58:19 crc kubenswrapper[4922]: E0929 09:58:19.990263 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:87a522d480797f54499bcd1c4a48837e1b17c33d4cc43e99ed7a53b8cedb17c7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" podUID="698d9305-27b8-44fe-bcd9-f034bdfa9b09" Sep 29 09:58:19 crc kubenswrapper[4922]: E0929 09:58:19.992127 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" podUID="0d5d466f-e41b-42c4-91ff-11e84d297b5d" Sep 29 09:58:19 crc kubenswrapper[4922]: E0929 09:58:19.992714 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" podUID="97df5e99-5243-4552-ab72-7c6526deea11" Sep 29 09:58:22 crc kubenswrapper[4922]: I0929 09:58:22.004030 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:22 crc kubenswrapper[4922]: I0929 09:58:22.005279 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:22 crc kubenswrapper[4922]: I0929 09:58:22.089033 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:23 crc kubenswrapper[4922]: I0929 09:58:23.082461 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:24 crc kubenswrapper[4922]: I0929 09:58:24.061550 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5hprb"] Sep 29 09:58:26 crc kubenswrapper[4922]: I0929 09:58:26.045349 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5hprb" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerName="registry-server" containerID="cri-o://425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69" gracePeriod=2 Sep 29 09:58:26 crc kubenswrapper[4922]: E0929 09:58:26.160049 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99d0ac0f_4be9_485a_8869_21d69d8f86b4.slice/crio-conmon-425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69.scope\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99d0ac0f_4be9_485a_8869_21d69d8f86b4.slice/crio-425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69.scope\": RecentStats: unable to find data in memory cache]" Sep 29 09:58:26 crc kubenswrapper[4922]: I0929 09:58:26.896731 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-8d8dc476c-zjv29" Sep 29 09:58:27 crc kubenswrapper[4922]: I0929 09:58:27.057165 4922 generic.go:334] "Generic (PLEG): container finished" podID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerID="425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69" exitCode=0 Sep 29 09:58:27 crc kubenswrapper[4922]: I0929 09:58:27.057228 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hprb" event={"ID":"99d0ac0f-4be9-485a-8869-21d69d8f86b4","Type":"ContainerDied","Data":"425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69"} Sep 29 09:58:32 crc kubenswrapper[4922]: E0929 09:58:32.005265 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69 is running failed: container process not found" containerID="425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69" cmd=["grpc_health_probe","-addr=:50051"] Sep 29 09:58:32 crc kubenswrapper[4922]: E0929 09:58:32.010321 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69 is running failed: container process not found" containerID="425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69" cmd=["grpc_health_probe","-addr=:50051"] Sep 29 09:58:32 crc 
kubenswrapper[4922]: E0929 09:58:32.011162 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69 is running failed: container process not found" containerID="425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69" cmd=["grpc_health_probe","-addr=:50051"] Sep 29 09:58:32 crc kubenswrapper[4922]: E0929 09:58:32.011504 4922 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-5hprb" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerName="registry-server" Sep 29 09:58:32 crc kubenswrapper[4922]: E0929 09:58:32.579481 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:c1ea5a0c814923293bcc2f1c82c5aa7a25c3e65bfc63c9c3b81f88558e256d93" Sep 29 09:58:32 crc kubenswrapper[4922]: E0929 09:58:32.579908 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:c1ea5a0c814923293bcc2f1c82c5aa7a25c3e65bfc63c9c3b81f88558e256d93,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mcbn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-687b9cf756-b69kd_openstack-operators(9c51299d-7ce3-4dff-b555-8cc2bcee6e4c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:58:33 crc kubenswrapper[4922]: E0929 09:58:33.385173 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/placement-operator@sha256:8a0e2fb898fc27998e03e0819729074753b3dccc7e5c79204033d2753f7522a8" Sep 29 09:58:33 crc kubenswrapper[4922]: E0929 09:58:33.385510 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:8a0e2fb898fc27998e03e0819729074753b3dccc7e5c79204033d2753f7522a8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxqhm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-774b97b48-pnns5_openstack-operators(c38d04c4-b717-4155-b646-b06c3dac3386): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:58:33 crc kubenswrapper[4922]: E0929 09:58:33.890989 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:a7eacfe7657c55521404e56e90764896845837d62c6689b3e9485c65f99055f8" Sep 29 09:58:33 crc kubenswrapper[4922]: E0929 09:58:33.891981 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:a7eacfe7657c55521404e56e90764896845837d62c6689b3e9485c65f99055f8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rgkc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
keystone-operator-controller-manager-7bf498966c-xns9g_openstack-operators(39c6dedb-23e2-4515-83c8-1e85e0136cc8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:58:34 crc kubenswrapper[4922]: E0929 09:58:34.329777 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:26db59a990341558d29c00da7503b2c5b9a415db8cc04a0006f198f30ec016d4" Sep 29 09:58:34 crc kubenswrapper[4922]: E0929 09:58:34.329994 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:26db59a990341558d29c00da7503b2c5b9a415db8cc04a0006f198f30ec016d4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rjphh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5f95c46c78-rvj4d_openstack-operators(c4ba5f8a-ca61-4870-bc8e-017e79e139a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:58:34 crc kubenswrapper[4922]: E0929 09:58:34.776193 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8" Sep 29 09:58:34 crc kubenswrapper[4922]: E0929 09:58:34.776445 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8r5dr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-76fcc6dc7c-5wnjj_openstack-operators(60630351-afcc-4792-bb16-5994368117cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:58:35 crc kubenswrapper[4922]: E0929 09:58:35.295919 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef" Sep 29 09:58:35 crc kubenswrapper[4922]: E0929 09:58:35.296170 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhrmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-c7c776c96-knzwz_openstack-operators(a01ec1f8-817f-4ed8-9431-01847d4956be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.641272 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.793008 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cnhl\" (UniqueName: \"kubernetes.io/projected/99d0ac0f-4be9-485a-8869-21d69d8f86b4-kube-api-access-8cnhl\") pod \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.793101 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-catalog-content\") pod \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.793191 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-utilities\") pod \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\" (UID: \"99d0ac0f-4be9-485a-8869-21d69d8f86b4\") " Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.794591 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-utilities" (OuterVolumeSpecName: "utilities") pod "99d0ac0f-4be9-485a-8869-21d69d8f86b4" (UID: "99d0ac0f-4be9-485a-8869-21d69d8f86b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.822299 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d0ac0f-4be9-485a-8869-21d69d8f86b4-kube-api-access-8cnhl" (OuterVolumeSpecName: "kube-api-access-8cnhl") pod "99d0ac0f-4be9-485a-8869-21d69d8f86b4" (UID: "99d0ac0f-4be9-485a-8869-21d69d8f86b4"). InnerVolumeSpecName "kube-api-access-8cnhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.846759 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99d0ac0f-4be9-485a-8869-21d69d8f86b4" (UID: "99d0ac0f-4be9-485a-8869-21d69d8f86b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.894285 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.894330 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cnhl\" (UniqueName: \"kubernetes.io/projected/99d0ac0f-4be9-485a-8869-21d69d8f86b4-kube-api-access-8cnhl\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:36 crc kubenswrapper[4922]: I0929 09:58:36.894343 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d0ac0f-4be9-485a-8869-21d69d8f86b4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 09:58:37 crc kubenswrapper[4922]: I0929 09:58:37.174018 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hprb" event={"ID":"99d0ac0f-4be9-485a-8869-21d69d8f86b4","Type":"ContainerDied","Data":"86eeb67caa11f41e562042489c82e34e9871fccb4d46aee6244dd6c386ec6ba5"} Sep 29 09:58:37 crc kubenswrapper[4922]: I0929 09:58:37.174091 4922 scope.go:117] "RemoveContainer" containerID="425377ffa234721d4bfafefd27801ffd6c8b014082d4da3e7303d7748bf4dc69" Sep 29 09:58:37 crc kubenswrapper[4922]: I0929 09:58:37.174240 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hprb" Sep 29 09:58:37 crc kubenswrapper[4922]: I0929 09:58:37.211970 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5hprb"] Sep 29 09:58:37 crc kubenswrapper[4922]: I0929 09:58:37.217221 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5hprb"] Sep 29 09:58:37 crc kubenswrapper[4922]: I0929 09:58:37.463930 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" path="/var/lib/kubelet/pods/99d0ac0f-4be9-485a-8869-21d69d8f86b4/volumes" Sep 29 09:58:37 crc kubenswrapper[4922]: E0929 09:58:37.522046 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b" Sep 29 09:58:37 crc kubenswrapper[4922]: E0929 09:58:37.522362 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j66lq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-lspp8_openstack-operators(c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:58:37 crc kubenswrapper[4922]: E0929 09:58:37.523694 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" podUID="c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3" Sep 29 09:58:38 crc kubenswrapper[4922]: E0929 09:58:38.184405 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" podUID="c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3" Sep 29 09:58:39 crc kubenswrapper[4922]: I0929 09:58:39.013306 4922 scope.go:117] "RemoveContainer" containerID="8c3149dc3b2ffe2db16e3c4b4b6aefb2908de1f36d4b4edcde1600ecb9096c66" Sep 29 09:58:39 crc kubenswrapper[4922]: I0929 09:58:39.187401 4922 scope.go:117] "RemoveContainer" containerID="8a132fc939899ae225f0897bd38ffb391130544853682cbec9229583c83bb40a" Sep 29 09:58:39 crc kubenswrapper[4922]: E0929 09:58:39.325243 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" podUID="9c51299d-7ce3-4dff-b555-8cc2bcee6e4c" Sep 29 09:58:39 crc kubenswrapper[4922]: E0929 09:58:39.387987 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" podUID="c38d04c4-b717-4155-b646-b06c3dac3386" Sep 29 09:58:39 crc kubenswrapper[4922]: E0929 09:58:39.395078 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" podUID="39c6dedb-23e2-4515-83c8-1e85e0136cc8" Sep 29 09:58:39 crc kubenswrapper[4922]: E0929 09:58:39.444917 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" 
podUID="60630351-afcc-4792-bb16-5994368117cd" Sep 29 09:58:39 crc kubenswrapper[4922]: E0929 09:58:39.466251 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" podUID="a01ec1f8-817f-4ed8-9431-01847d4956be" Sep 29 09:58:39 crc kubenswrapper[4922]: E0929 09:58:39.482814 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" podUID="c4ba5f8a-ca61-4870-bc8e-017e79e139a5" Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.223224 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" event={"ID":"71aa3678-e2c4-4a23-9e66-738fddb6066f","Type":"ContainerStarted","Data":"48b5e081c2aced77c16e2418260e9e358ccad586f6dcc765e7a5da3e81d0eb64"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.224526 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" event={"ID":"8ad2a8b0-1e70-47e2-80a1-139eedb15541","Type":"ContainerStarted","Data":"ddd0bfc192ee25b7af08398a02b28cf576c8d4859f83ea298936ea2a72ab1d51"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.228044 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" event={"ID":"05dca5c7-0856-4c86-9bf8-99c6edc07252","Type":"ContainerStarted","Data":"22c6724474a298586a2a4785031aeddd34f616c27404f06210984005e1b91d41"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.229123 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" event={"ID":"9c51299d-7ce3-4dff-b555-8cc2bcee6e4c","Type":"ContainerStarted","Data":"16fc0659856a5e98a0946690beff1af2afb2c182f35410b7389999be49c5e067"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.232182 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" event={"ID":"733a9696-fd92-42d4-b4df-6e4ba3d9d433","Type":"ContainerStarted","Data":"c929c44a185b3491450eac99ce28e96c98f223cff3fd7f86348b73fd241f9a74"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.233234 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" event={"ID":"c38d04c4-b717-4155-b646-b06c3dac3386","Type":"ContainerStarted","Data":"3ffa6a728ed0713dd92382a8f56106fc24dfb2726668d31a2088fc104b2e2c3c"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.242519 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" event={"ID":"0ed6eee8-7938-4f36-98f8-99af2cc40a4e","Type":"ContainerStarted","Data":"d02de5da16883d948c59969e7dabeeb64ab1f2301ae6a550f2785c95e6150558"} Sep 29 09:58:40 crc kubenswrapper[4922]: E0929 09:58:40.255118 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:8a0e2fb898fc27998e03e0819729074753b3dccc7e5c79204033d2753f7522a8\\\"\"" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" podUID="c38d04c4-b717-4155-b646-b06c3dac3386" Sep 29 09:58:40 crc kubenswrapper[4922]: E0929 09:58:40.255710 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:c1ea5a0c814923293bcc2f1c82c5aa7a25c3e65bfc63c9c3b81f88558e256d93\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" podUID="9c51299d-7ce3-4dff-b555-8cc2bcee6e4c" Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.264677 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" event={"ID":"2536f9c0-aac9-4d2c-be19-8afe9ac2e418","Type":"ContainerStarted","Data":"a4ea859f3eaba83fa0b391f4eadcb84696a4accce53919d2e56a49256dbf0c07"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.269525 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" event={"ID":"f4fbefa3-c5d4-4a51-b90a-512ebfcef863","Type":"ContainerStarted","Data":"f65a305cdda5267e7e55b5d9a10499ad0c180c30ad415dcd2b5f9a39498416cd"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.279256 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" event={"ID":"39c6dedb-23e2-4515-83c8-1e85e0136cc8","Type":"ContainerStarted","Data":"9029364a5602523f678fe57aa16e5db15a5ed8cc374a9495347ec18e9e56ed37"} Sep 29 09:58:40 crc kubenswrapper[4922]: E0929 09:58:40.281283 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:a7eacfe7657c55521404e56e90764896845837d62c6689b3e9485c65f99055f8\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" podUID="39c6dedb-23e2-4515-83c8-1e85e0136cc8" Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.290513 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" 
event={"ID":"c4ba5f8a-ca61-4870-bc8e-017e79e139a5","Type":"ContainerStarted","Data":"a010ee2907fe1dfc5fe98bd2a3d86e9797ff31bbeee8f9e3e6e0ff8688c51e80"} Sep 29 09:58:40 crc kubenswrapper[4922]: E0929 09:58:40.292668 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:26db59a990341558d29c00da7503b2c5b9a415db8cc04a0006f198f30ec016d4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" podUID="c4ba5f8a-ca61-4870-bc8e-017e79e139a5" Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.308943 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" event={"ID":"c9221095-3450-45f9-9aa2-e4994c8471ef","Type":"ContainerStarted","Data":"dd0ab470a9ddc427d2dc7d7d33f1ecddeeeca925c104f11127f94b9936ff2371"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.324640 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" event={"ID":"60630351-afcc-4792-bb16-5994368117cd","Type":"ContainerStarted","Data":"d4973030e9e5dc1413d2d8d6b3d369b2c344951b42686275124a17bf5bc333a9"} Sep 29 09:58:40 crc kubenswrapper[4922]: E0929 09:58:40.327230 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" podUID="60630351-afcc-4792-bb16-5994368117cd" Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.341915 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" 
event={"ID":"a01ec1f8-817f-4ed8-9431-01847d4956be","Type":"ContainerStarted","Data":"fbfe42d82d54cfdb2cecde854814a47a67b8a74090ae4e3a66569250eb2dd831"} Sep 29 09:58:40 crc kubenswrapper[4922]: E0929 09:58:40.346752 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef\\\"\"" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" podUID="a01ec1f8-817f-4ed8-9431-01847d4956be" Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.355170 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" event={"ID":"aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0","Type":"ContainerStarted","Data":"d318a162c23d8f5362c5b18ccf7bd15bf8651957e0b988b490b591c1e571bf6f"} Sep 29 09:58:40 crc kubenswrapper[4922]: I0929 09:58:40.369716 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" event={"ID":"fe4d01cb-1457-4cca-b2ed-7da6250a47df","Type":"ContainerStarted","Data":"dbb0aa2d383a78f3f33671d7b201058ed6bb517342375e7d0796d27948d54a6d"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.378552 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" event={"ID":"2bbc35ab-8adc-445e-bc17-690ce9533a3e","Type":"ContainerStarted","Data":"e26d87c90b477aa745e6b20e799cc21d77ddfe63f7adf5d602114ca46758a9e6"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.379252 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.380531 4922 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" event={"ID":"698d9305-27b8-44fe-bcd9-f034bdfa9b09","Type":"ContainerStarted","Data":"62a2b7ad99f9a01db8111a5b87e3009ad594ba3797bab9e9dba6cdec775cd9f7"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.380727 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.382938 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" event={"ID":"71aa3678-e2c4-4a23-9e66-738fddb6066f","Type":"ContainerStarted","Data":"8b8c7d29d4f6a201d7874eff5ea858fc56fb154339e995f3ecc660bbc2a0fae2"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.383110 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.385543 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" event={"ID":"88c2443d-e9bf-441b-ae76-93b7f63c790b","Type":"ContainerStarted","Data":"9a4cb8070183d4656a9bac78276c45a2d90e3378841c4496c7835c3bde00e40b"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.385602 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" event={"ID":"88c2443d-e9bf-441b-ae76-93b7f63c790b","Type":"ContainerStarted","Data":"4ec87f1f9858cde6087f5c4f952f248df910d52d090dff2f6f68532afe432782"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.385674 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.388028 4922 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" event={"ID":"05dca5c7-0856-4c86-9bf8-99c6edc07252","Type":"ContainerStarted","Data":"fbdf88a7f2282b3f53fc432d1e5c55498ad363b5de175426cd3c475b598c0ed6"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.388191 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.390322 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" event={"ID":"fe4d01cb-1457-4cca-b2ed-7da6250a47df","Type":"ContainerStarted","Data":"179967debdcdf82142b547ecb2ef37af6b17f4521038d53bf54832cf32b9a18c"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.390399 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.392260 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" event={"ID":"733a9696-fd92-42d4-b4df-6e4ba3d9d433","Type":"ContainerStarted","Data":"929fa2f8424f5bf7298d622ec823680d959aaa17a2cc88fc8a2834acf53926ce"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.392402 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.394268 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" event={"ID":"0ed6eee8-7938-4f36-98f8-99af2cc40a4e","Type":"ContainerStarted","Data":"a9c832f4ba7bfe68808012fb0cda309566b1d6f43c978581a7b8eeb43caa3999"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.394407 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.396235 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" event={"ID":"f4fbefa3-c5d4-4a51-b90a-512ebfcef863","Type":"ContainerStarted","Data":"589e6c50789e80cef5f05714047dca9be9f6f1ca616c9336312502e2bd2839b5"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.396374 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.398685 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" event={"ID":"0d5d466f-e41b-42c4-91ff-11e84d297b5d","Type":"ContainerStarted","Data":"dcbcad4e7d49ace7b91187ad0c522f58398526b424c52b971e0c31e6c4ff356e"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.398932 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.401005 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" event={"ID":"aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0","Type":"ContainerStarted","Data":"10283ec3c158b7a5c7e968e5981f3765c5d642d37f825e0b1e1f0ab0d316e8da"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.401228 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.402931 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" event={"ID":"97df5e99-5243-4552-ab72-7c6526deea11","Type":"ContainerStarted","Data":"a0845a40b7026b64a811456bab9c2befec5160a0e0c60afe7bfd1ab879e0aced"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.403123 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.405215 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" event={"ID":"c9221095-3450-45f9-9aa2-e4994c8471ef","Type":"ContainerStarted","Data":"2b4e3a0336d24d5540e777c31dc001ff95fe2fcc3fe36cee0083cfccdd403a7f"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.405352 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.407166 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" event={"ID":"2536f9c0-aac9-4d2c-be19-8afe9ac2e418","Type":"ContainerStarted","Data":"a216cc847c728069cd606e4113ecd4d4153d54d8a74e16003c8582ff7d9bc28b"} Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.407298 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.409017 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" event={"ID":"8ad2a8b0-1e70-47e2-80a1-139eedb15541","Type":"ContainerStarted","Data":"231e26fbf16915b49a5bf6dbc3bc92787478c517e963e331243d506f9f13e9a5"} Sep 29 09:58:41 crc kubenswrapper[4922]: E0929 09:58:41.410538 4922 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:a7eacfe7657c55521404e56e90764896845837d62c6689b3e9485c65f99055f8\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" podUID="39c6dedb-23e2-4515-83c8-1e85e0136cc8" Sep 29 09:58:41 crc kubenswrapper[4922]: E0929 09:58:41.410556 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:26db59a990341558d29c00da7503b2c5b9a415db8cc04a0006f198f30ec016d4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" podUID="c4ba5f8a-ca61-4870-bc8e-017e79e139a5" Sep 29 09:58:41 crc kubenswrapper[4922]: E0929 09:58:41.410664 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:8a0e2fb898fc27998e03e0819729074753b3dccc7e5c79204033d2753f7522a8\\\"\"" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" podUID="c38d04c4-b717-4155-b646-b06c3dac3386" Sep 29 09:58:41 crc kubenswrapper[4922]: E0929 09:58:41.410885 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef\\\"\"" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" podUID="a01ec1f8-817f-4ed8-9431-01847d4956be" Sep 29 09:58:41 crc kubenswrapper[4922]: E0929 09:58:41.411667 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" podUID="60630351-afcc-4792-bb16-5994368117cd" Sep 29 09:58:41 crc kubenswrapper[4922]: E0929 09:58:41.411743 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:c1ea5a0c814923293bcc2f1c82c5aa7a25c3e65bfc63c9c3b81f88558e256d93\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" podUID="9c51299d-7ce3-4dff-b555-8cc2bcee6e4c" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.420418 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" podStartSLOduration=5.044601282 podStartE2EDuration="26.42040878s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.756677026 +0000 UTC m=+823.122907290" lastFinishedPulling="2025-09-29 09:58:39.132484524 +0000 UTC m=+844.498714788" observedRunningTime="2025-09-29 09:58:41.420125283 +0000 UTC m=+846.786355547" watchObservedRunningTime="2025-09-29 09:58:41.42040878 +0000 UTC m=+846.786639044" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.441395 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" podStartSLOduration=6.286173313 podStartE2EDuration="26.441369027s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.385982896 +0000 UTC m=+822.752213150" lastFinishedPulling="2025-09-29 09:58:37.5411786 +0000 UTC m=+842.907408864" observedRunningTime="2025-09-29 09:58:41.438970982 +0000 UTC m=+846.805201246" 
watchObservedRunningTime="2025-09-29 09:58:41.441369027 +0000 UTC m=+846.807599291" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.467910 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" podStartSLOduration=6.603962657 podStartE2EDuration="26.467880014s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.677463539 +0000 UTC m=+823.043693803" lastFinishedPulling="2025-09-29 09:58:37.541380896 +0000 UTC m=+842.907611160" observedRunningTime="2025-09-29 09:58:41.462056127 +0000 UTC m=+846.828286391" watchObservedRunningTime="2025-09-29 09:58:41.467880014 +0000 UTC m=+846.834110298" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.529377 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" podStartSLOduration=5.11936593 podStartE2EDuration="26.529354727s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.735769256 +0000 UTC m=+823.101999520" lastFinishedPulling="2025-09-29 09:58:39.145758053 +0000 UTC m=+844.511988317" observedRunningTime="2025-09-29 09:58:41.510018574 +0000 UTC m=+846.876248848" watchObservedRunningTime="2025-09-29 09:58:41.529354727 +0000 UTC m=+846.895585001" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.532491 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" podStartSLOduration=5.4864209 podStartE2EDuration="26.532479982s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:16.494028909 +0000 UTC m=+821.860259173" lastFinishedPulling="2025-09-29 09:58:37.540087991 +0000 UTC m=+842.906318255" observedRunningTime="2025-09-29 09:58:41.52908431 +0000 UTC m=+846.895314574" 
watchObservedRunningTime="2025-09-29 09:58:41.532479982 +0000 UTC m=+846.898710246" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.571311 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" podStartSLOduration=6.73177098 podStartE2EDuration="26.571292282s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.70061258 +0000 UTC m=+823.066842844" lastFinishedPulling="2025-09-29 09:58:37.540133872 +0000 UTC m=+842.906364146" observedRunningTime="2025-09-29 09:58:41.553365157 +0000 UTC m=+846.919595421" watchObservedRunningTime="2025-09-29 09:58:41.571292282 +0000 UTC m=+846.937522536" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.592045 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" podStartSLOduration=6.099637811 podStartE2EDuration="26.592021213s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.047761051 +0000 UTC m=+822.413991315" lastFinishedPulling="2025-09-29 09:58:37.540144453 +0000 UTC m=+842.906374717" observedRunningTime="2025-09-29 09:58:41.568968619 +0000 UTC m=+846.935198883" watchObservedRunningTime="2025-09-29 09:58:41.592021213 +0000 UTC m=+846.958251477" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.615334 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" podStartSLOduration=6.44808875 podStartE2EDuration="26.615305572s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.372922221 +0000 UTC m=+822.739152485" lastFinishedPulling="2025-09-29 09:58:37.540139043 +0000 UTC m=+842.906369307" observedRunningTime="2025-09-29 09:58:41.610280666 +0000 UTC m=+846.976510930" 
watchObservedRunningTime="2025-09-29 09:58:41.615305572 +0000 UTC m=+846.981535836" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.650643 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" podStartSLOduration=5.345854903 podStartE2EDuration="26.650619047s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:18.020377632 +0000 UTC m=+823.386607896" lastFinishedPulling="2025-09-29 09:58:39.325141766 +0000 UTC m=+844.691372040" observedRunningTime="2025-09-29 09:58:41.647996827 +0000 UTC m=+847.014227091" watchObservedRunningTime="2025-09-29 09:58:41.650619047 +0000 UTC m=+847.016849311" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.675387 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" podStartSLOduration=6.078028728 podStartE2EDuration="27.675361056s" podCreationTimestamp="2025-09-29 09:58:14 +0000 UTC" firstStartedPulling="2025-09-29 09:58:16.405217412 +0000 UTC m=+821.771447676" lastFinishedPulling="2025-09-29 09:58:38.00254973 +0000 UTC m=+843.368780004" observedRunningTime="2025-09-29 09:58:41.669865188 +0000 UTC m=+847.036095452" watchObservedRunningTime="2025-09-29 09:58:41.675361056 +0000 UTC m=+847.041591320" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.739983 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" podStartSLOduration=7.220103202 podStartE2EDuration="26.739956444s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:18.020314111 +0000 UTC m=+823.386544385" lastFinishedPulling="2025-09-29 09:58:37.540167363 +0000 UTC m=+842.906397627" observedRunningTime="2025-09-29 09:58:41.733953241 +0000 UTC m=+847.100183515" 
watchObservedRunningTime="2025-09-29 09:58:41.739956444 +0000 UTC m=+847.106186708" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.762009 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" podStartSLOduration=5.816088363 podStartE2EDuration="26.76198468s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.057151476 +0000 UTC m=+822.423381730" lastFinishedPulling="2025-09-29 09:58:38.003047773 +0000 UTC m=+843.369278047" observedRunningTime="2025-09-29 09:58:41.760363736 +0000 UTC m=+847.126594010" watchObservedRunningTime="2025-09-29 09:58:41.76198468 +0000 UTC m=+847.128214944" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.819973 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" podStartSLOduration=6.696359032 podStartE2EDuration="26.819954427s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.416591979 +0000 UTC m=+822.782822243" lastFinishedPulling="2025-09-29 09:58:37.540187374 +0000 UTC m=+842.906417638" observedRunningTime="2025-09-29 09:58:41.78158951 +0000 UTC m=+847.147819774" watchObservedRunningTime="2025-09-29 09:58:41.819954427 +0000 UTC m=+847.186184681" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.830019 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" podStartSLOduration=5.459203079 podStartE2EDuration="26.82998988s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.706168081 +0000 UTC m=+823.072398345" lastFinishedPulling="2025-09-29 09:58:39.076954882 +0000 UTC m=+844.443185146" observedRunningTime="2025-09-29 09:58:41.816964967 +0000 UTC m=+847.183195241" watchObservedRunningTime="2025-09-29 
09:58:41.82998988 +0000 UTC m=+847.196220144" Sep 29 09:58:41 crc kubenswrapper[4922]: I0929 09:58:41.837256 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" podStartSLOduration=5.967871153 podStartE2EDuration="26.837233085s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:16.671919151 +0000 UTC m=+822.038149415" lastFinishedPulling="2025-09-29 09:58:37.541281083 +0000 UTC m=+842.907511347" observedRunningTime="2025-09-29 09:58:41.834576953 +0000 UTC m=+847.200807217" watchObservedRunningTime="2025-09-29 09:58:41.837233085 +0000 UTC m=+847.203463349" Sep 29 09:58:42 crc kubenswrapper[4922]: I0929 09:58:42.426128 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" Sep 29 09:58:45 crc kubenswrapper[4922]: I0929 09:58:45.366520 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6495d75b5-wxkn7" Sep 29 09:58:45 crc kubenswrapper[4922]: I0929 09:58:45.409650 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-748c574d75-8vhzj" Sep 29 09:58:45 crc kubenswrapper[4922]: I0929 09:58:45.504813 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-8ff95898-jhr5q" Sep 29 09:58:45 crc kubenswrapper[4922]: I0929 09:58:45.551800 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9fc8d5567-jbs5x" Sep 29 09:58:45 crc kubenswrapper[4922]: I0929 09:58:45.556094 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-695847bc78-ft79c" Sep 29 09:58:45 
crc kubenswrapper[4922]: I0929 09:58:45.740318 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-67b5d44b7f-ph6mk" Sep 29 09:58:45 crc kubenswrapper[4922]: I0929 09:58:45.757087 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-7d74f4d695-45kw4" Sep 29 09:58:45 crc kubenswrapper[4922]: I0929 09:58:45.806493 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54d766c9f9-r4lcr" Sep 29 09:58:45 crc kubenswrapper[4922]: I0929 09:58:45.922463 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-56cf9c6b99-r8wxg" Sep 29 09:58:46 crc kubenswrapper[4922]: I0929 09:58:46.150459 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-858cd69f49-8pnnw" Sep 29 09:58:46 crc kubenswrapper[4922]: I0929 09:58:46.162286 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-82g7r" Sep 29 09:58:46 crc kubenswrapper[4922]: I0929 09:58:46.283068 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-lcd9t" Sep 29 09:58:46 crc kubenswrapper[4922]: I0929 09:58:46.486037 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-p2hsw" Sep 29 09:58:46 crc kubenswrapper[4922]: I0929 09:58:46.513311 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5bf96cfbc4-5s5wf" Sep 29 09:58:46 crc kubenswrapper[4922]: I0929 09:58:46.531411 
4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-lfh4h" Sep 29 09:58:53 crc kubenswrapper[4922]: I0929 09:58:53.523610 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" event={"ID":"c4ba5f8a-ca61-4870-bc8e-017e79e139a5","Type":"ContainerStarted","Data":"521f738ca7b0c93c2a8d8237bd9bef165f64f09c9cad90ab493a03f28fd8bb4c"} Sep 29 09:58:53 crc kubenswrapper[4922]: I0929 09:58:53.524767 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" Sep 29 09:58:53 crc kubenswrapper[4922]: I0929 09:58:53.526464 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" event={"ID":"c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3","Type":"ContainerStarted","Data":"dbb5e9565d26824b001e45451347ce833aed478aeccd2845858942449434d0f4"} Sep 29 09:58:53 crc kubenswrapper[4922]: I0929 09:58:53.528994 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" event={"ID":"a01ec1f8-817f-4ed8-9431-01847d4956be","Type":"ContainerStarted","Data":"ccf5acd8419647dd77e995e035810bafe34bd7cc1970eb1f3b0c2284936505d9"} Sep 29 09:58:53 crc kubenswrapper[4922]: I0929 09:58:53.529273 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" Sep 29 09:58:53 crc kubenswrapper[4922]: I0929 09:58:53.554727 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" podStartSLOduration=3.232812044 podStartE2EDuration="38.554701621s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.697586567 +0000 UTC 
m=+823.063816831" lastFinishedPulling="2025-09-29 09:58:53.019476114 +0000 UTC m=+858.385706408" observedRunningTime="2025-09-29 09:58:53.552965794 +0000 UTC m=+858.919196068" watchObservedRunningTime="2025-09-29 09:58:53.554701621 +0000 UTC m=+858.920931895" Sep 29 09:58:53 crc kubenswrapper[4922]: I0929 09:58:53.586472 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" podStartSLOduration=2.809465311 podStartE2EDuration="38.58643926s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.386325156 +0000 UTC m=+822.752555420" lastFinishedPulling="2025-09-29 09:58:53.163299105 +0000 UTC m=+858.529529369" observedRunningTime="2025-09-29 09:58:53.582352339 +0000 UTC m=+858.948582633" watchObservedRunningTime="2025-09-29 09:58:53.58643926 +0000 UTC m=+858.952669564" Sep 29 09:58:53 crc kubenswrapper[4922]: I0929 09:58:53.602089 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-lspp8" podStartSLOduration=3.659156229 podStartE2EDuration="38.602055612s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:18.070398584 +0000 UTC m=+823.436628848" lastFinishedPulling="2025-09-29 09:58:53.013297927 +0000 UTC m=+858.379528231" observedRunningTime="2025-09-29 09:58:53.599609927 +0000 UTC m=+858.965840221" watchObservedRunningTime="2025-09-29 09:58:53.602055612 +0000 UTC m=+858.968285916" Sep 29 09:58:54 crc kubenswrapper[4922]: I0929 09:58:54.544639 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" event={"ID":"c38d04c4-b717-4155-b646-b06c3dac3386","Type":"ContainerStarted","Data":"b3755eb37e08884e1a0dc8778e61ad93dea694201bcf18edc91e0926ed4cd22e"} Sep 29 09:58:54 crc kubenswrapper[4922]: I0929 09:58:54.571153 4922 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" podStartSLOduration=3.288971964 podStartE2EDuration="39.571124345s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.699376026 +0000 UTC m=+823.065606290" lastFinishedPulling="2025-09-29 09:58:53.981528367 +0000 UTC m=+859.347758671" observedRunningTime="2025-09-29 09:58:54.56575725 +0000 UTC m=+859.931987554" watchObservedRunningTime="2025-09-29 09:58:54.571124345 +0000 UTC m=+859.937354619" Sep 29 09:58:55 crc kubenswrapper[4922]: I0929 09:58:55.558178 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" event={"ID":"39c6dedb-23e2-4515-83c8-1e85e0136cc8","Type":"ContainerStarted","Data":"617862454e5bd4ea22eebf1e9bbdd9a1302073edbb0614a592c4ac7438977a8b"} Sep 29 09:58:55 crc kubenswrapper[4922]: I0929 09:58:55.558988 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" Sep 29 09:58:55 crc kubenswrapper[4922]: I0929 09:58:55.579923 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" podStartSLOduration=2.707156951 podStartE2EDuration="40.579899642s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.070278414 +0000 UTC m=+822.436508678" lastFinishedPulling="2025-09-29 09:58:54.943021115 +0000 UTC m=+860.309251369" observedRunningTime="2025-09-29 09:58:55.578190886 +0000 UTC m=+860.944421160" watchObservedRunningTime="2025-09-29 09:58:55.579899642 +0000 UTC m=+860.946129916" Sep 29 09:58:55 crc kubenswrapper[4922]: I0929 09:58:55.970793 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" Sep 29 09:58:56 crc kubenswrapper[4922]: I0929 09:58:56.568292 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" event={"ID":"9c51299d-7ce3-4dff-b555-8cc2bcee6e4c","Type":"ContainerStarted","Data":"6e9b81649ac91fc1ae6cac61e20aeb84fbdeedd056a1e66d60627921160b5127"} Sep 29 09:58:56 crc kubenswrapper[4922]: I0929 09:58:56.568607 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" Sep 29 09:58:56 crc kubenswrapper[4922]: I0929 09:58:56.570458 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" event={"ID":"60630351-afcc-4792-bb16-5994368117cd","Type":"ContainerStarted","Data":"1988870757c63a170589cc06333d419aebae3d4c10692d9d10e651d9c119e4f9"} Sep 29 09:58:56 crc kubenswrapper[4922]: I0929 09:58:56.570793 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" Sep 29 09:58:56 crc kubenswrapper[4922]: I0929 09:58:56.592067 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" podStartSLOduration=3.05398982 podStartE2EDuration="41.592027879s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.416722413 +0000 UTC m=+822.782952677" lastFinishedPulling="2025-09-29 09:58:55.954760472 +0000 UTC m=+861.320990736" observedRunningTime="2025-09-29 09:58:56.587656731 +0000 UTC m=+861.953886995" watchObservedRunningTime="2025-09-29 09:58:56.592027879 +0000 UTC m=+861.958258163" Sep 29 09:58:56 crc kubenswrapper[4922]: I0929 09:58:56.619126 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" podStartSLOduration=3.139882112 podStartE2EDuration="41.619097431s" podCreationTimestamp="2025-09-29 09:58:15 +0000 UTC" firstStartedPulling="2025-09-29 09:58:17.409893987 +0000 UTC m=+822.776124241" lastFinishedPulling="2025-09-29 09:58:55.889109296 +0000 UTC m=+861.255339560" observedRunningTime="2025-09-29 09:58:56.61094454 +0000 UTC m=+861.977174834" watchObservedRunningTime="2025-09-29 09:58:56.619097431 +0000 UTC m=+861.985327705" Sep 29 09:58:59 crc kubenswrapper[4922]: I0929 09:58:59.071051 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:58:59 crc kubenswrapper[4922]: I0929 09:58:59.073094 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:59:05 crc kubenswrapper[4922]: I0929 09:59:05.594665 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7bf498966c-xns9g" Sep 29 09:59:05 crc kubenswrapper[4922]: I0929 09:59:05.684025 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-687b9cf756-b69kd" Sep 29 09:59:05 crc kubenswrapper[4922]: I0929 09:59:05.849342 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-knzwz" Sep 29 09:59:05 crc kubenswrapper[4922]: I0929 09:59:05.869372 4922 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-5wnjj" Sep 29 09:59:05 crc kubenswrapper[4922]: I0929 09:59:05.972519 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-774b97b48-pnns5" Sep 29 09:59:06 crc kubenswrapper[4922]: I0929 09:59:06.022496 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5f95c46c78-rvj4d" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.432933 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fjxrj"] Sep 29 09:59:21 crc kubenswrapper[4922]: E0929 09:59:21.434019 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerName="extract-utilities" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.434033 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerName="extract-utilities" Sep 29 09:59:21 crc kubenswrapper[4922]: E0929 09:59:21.434081 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerName="registry-server" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.434087 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerName="registry-server" Sep 29 09:59:21 crc kubenswrapper[4922]: E0929 09:59:21.434097 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerName="extract-content" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.434103 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerName="extract-content" Sep 29 09:59:21 crc kubenswrapper[4922]: E0929 09:59:21.434118 4922 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerName="extract-utilities" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.434124 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerName="extract-utilities" Sep 29 09:59:21 crc kubenswrapper[4922]: E0929 09:59:21.434145 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerName="extract-content" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.434152 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerName="extract-content" Sep 29 09:59:21 crc kubenswrapper[4922]: E0929 09:59:21.434165 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerName="registry-server" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.434171 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerName="registry-server" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.434346 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d0ac0f-4be9-485a-8869-21d69d8f86b4" containerName="registry-server" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.434382 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b90a978-860b-46f3-a6c8-c7da96bcab3c" containerName="registry-server" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.435350 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.443310 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.443481 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.443505 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-hgqwg" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.443701 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.446291 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fjxrj"] Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.481913 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-config\") pod \"dnsmasq-dns-675f4bcbfc-fjxrj\" (UID: \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.482049 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9j84\" (UniqueName: \"kubernetes.io/projected/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-kube-api-access-c9j84\") pod \"dnsmasq-dns-675f4bcbfc-fjxrj\" (UID: \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.500339 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s2td6"] Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.504547 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.508469 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.529575 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s2td6"] Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.583618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-config\") pod \"dnsmasq-dns-78dd6ddcc-s2td6\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.583684 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t69w\" (UniqueName: \"kubernetes.io/projected/d52948d1-a1df-4e02-8729-a720f5b2f748-kube-api-access-4t69w\") pod \"dnsmasq-dns-78dd6ddcc-s2td6\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.583726 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9j84\" (UniqueName: \"kubernetes.io/projected/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-kube-api-access-c9j84\") pod \"dnsmasq-dns-675f4bcbfc-fjxrj\" (UID: \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.583966 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-s2td6\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 
09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.583993 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-config\") pod \"dnsmasq-dns-675f4bcbfc-fjxrj\" (UID: \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.585087 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-config\") pod \"dnsmasq-dns-675f4bcbfc-fjxrj\" (UID: \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.604397 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9j84\" (UniqueName: \"kubernetes.io/projected/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-kube-api-access-c9j84\") pod \"dnsmasq-dns-675f4bcbfc-fjxrj\" (UID: \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.685871 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-config\") pod \"dnsmasq-dns-78dd6ddcc-s2td6\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.686401 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t69w\" (UniqueName: \"kubernetes.io/projected/d52948d1-a1df-4e02-8729-a720f5b2f748-kube-api-access-4t69w\") pod \"dnsmasq-dns-78dd6ddcc-s2td6\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.686578 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-s2td6\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.686891 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-config\") pod \"dnsmasq-dns-78dd6ddcc-s2td6\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.687479 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-s2td6\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.706617 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t69w\" (UniqueName: \"kubernetes.io/projected/d52948d1-a1df-4e02-8729-a720f5b2f748-kube-api-access-4t69w\") pod \"dnsmasq-dns-78dd6ddcc-s2td6\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.770615 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:21 crc kubenswrapper[4922]: I0929 09:59:21.827492 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:22 crc kubenswrapper[4922]: I0929 09:59:22.052901 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fjxrj"] Sep 29 09:59:22 crc kubenswrapper[4922]: I0929 09:59:22.123416 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s2td6"] Sep 29 09:59:22 crc kubenswrapper[4922]: W0929 09:59:22.126628 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd52948d1_a1df_4e02_8729_a720f5b2f748.slice/crio-79b34980173307589adba261517375d5069cf07b8a93bda36f0fc8549ed9c0c2 WatchSource:0}: Error finding container 79b34980173307589adba261517375d5069cf07b8a93bda36f0fc8549ed9c0c2: Status 404 returned error can't find the container with id 79b34980173307589adba261517375d5069cf07b8a93bda36f0fc8549ed9c0c2 Sep 29 09:59:22 crc kubenswrapper[4922]: I0929 09:59:22.836580 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" event={"ID":"8425b0c8-2de6-4fc9-8b34-ba91b8034c32","Type":"ContainerStarted","Data":"a197ab2c29358123ada14e70ef8480dbc44feb0e3354a455cc0902f8b4db878d"} Sep 29 09:59:22 crc kubenswrapper[4922]: I0929 09:59:22.838582 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" event={"ID":"d52948d1-a1df-4e02-8729-a720f5b2f748","Type":"ContainerStarted","Data":"79b34980173307589adba261517375d5069cf07b8a93bda36f0fc8549ed9c0c2"} Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.237808 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fjxrj"] Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.278680 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7ssvz"] Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.289144 4922 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.289007 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7ssvz"] Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.348205 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skjc2\" (UniqueName: \"kubernetes.io/projected/05909797-8c26-44c6-8214-ac4e8b981900-kube-api-access-skjc2\") pod \"dnsmasq-dns-5ccc8479f9-7ssvz\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.348272 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-7ssvz\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.348322 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-config\") pod \"dnsmasq-dns-5ccc8479f9-7ssvz\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.453393 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skjc2\" (UniqueName: \"kubernetes.io/projected/05909797-8c26-44c6-8214-ac4e8b981900-kube-api-access-skjc2\") pod \"dnsmasq-dns-5ccc8479f9-7ssvz\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.453459 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-7ssvz\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.453500 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-config\") pod \"dnsmasq-dns-5ccc8479f9-7ssvz\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.454575 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-7ssvz\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.455043 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-config\") pod \"dnsmasq-dns-5ccc8479f9-7ssvz\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.476173 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skjc2\" (UniqueName: \"kubernetes.io/projected/05909797-8c26-44c6-8214-ac4e8b981900-kube-api-access-skjc2\") pod \"dnsmasq-dns-5ccc8479f9-7ssvz\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.581577 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s2td6"] Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.616891 4922 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.620787 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vndlk"] Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.622412 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.641798 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vndlk"] Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.657180 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-config\") pod \"dnsmasq-dns-57d769cc4f-vndlk\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.657696 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpk5g\" (UniqueName: \"kubernetes.io/projected/2026402b-4401-489a-9f34-264a57ec2501-kube-api-access-xpk5g\") pod \"dnsmasq-dns-57d769cc4f-vndlk\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.657761 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vndlk\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.759517 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-config\") pod \"dnsmasq-dns-57d769cc4f-vndlk\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.759593 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpk5g\" (UniqueName: \"kubernetes.io/projected/2026402b-4401-489a-9f34-264a57ec2501-kube-api-access-xpk5g\") pod \"dnsmasq-dns-57d769cc4f-vndlk\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.759642 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vndlk\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.760741 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vndlk\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.760962 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-config\") pod \"dnsmasq-dns-57d769cc4f-vndlk\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:24 crc kubenswrapper[4922]: I0929 09:59:24.784405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpk5g\" (UniqueName: \"kubernetes.io/projected/2026402b-4401-489a-9f34-264a57ec2501-kube-api-access-xpk5g\") pod 
\"dnsmasq-dns-57d769cc4f-vndlk\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.026056 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.143063 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7ssvz"] Sep 29 09:59:25 crc kubenswrapper[4922]: W0929 09:59:25.153251 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05909797_8c26_44c6_8214_ac4e8b981900.slice/crio-486f1b961660d6dd45496e33a07ff2e7057a4b3fe39fdb76cb9146b99b3ec76b WatchSource:0}: Error finding container 486f1b961660d6dd45496e33a07ff2e7057a4b3fe39fdb76cb9146b99b3ec76b: Status 404 returned error can't find the container with id 486f1b961660d6dd45496e33a07ff2e7057a4b3fe39fdb76cb9146b99b3ec76b Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.442661 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.444867 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.447431 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.450198 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.450408 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s9fwb" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.452488 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.453628 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.453823 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.454090 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.463465 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475274 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2ad8ac2-2191-43ab-9979-9ccbe368d883-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475354 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475389 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475413 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475452 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475488 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7ndd\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-kube-api-access-v7ndd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475515 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475560 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475628 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475657 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2ad8ac2-2191-43ab-9979-9ccbe368d883-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.475936 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.550922 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-vndlk"] Sep 29 09:59:25 crc kubenswrapper[4922]: W0929 09:59:25.558309 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2026402b_4401_489a_9f34_264a57ec2501.slice/crio-0527ffdf9fbc6dc23ff4f6f3b835a5cdd6860e8215b09792874e13e1007c00ee WatchSource:0}: Error finding container 0527ffdf9fbc6dc23ff4f6f3b835a5cdd6860e8215b09792874e13e1007c00ee: Status 404 returned error can't find the container with id 0527ffdf9fbc6dc23ff4f6f3b835a5cdd6860e8215b09792874e13e1007c00ee Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.577814 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2ad8ac2-2191-43ab-9979-9ccbe368d883-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.577914 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.577940 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.577957 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.577973 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.577995 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7ndd\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-kube-api-access-v7ndd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.578027 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.578044 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.578078 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.578099 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2ad8ac2-2191-43ab-9979-9ccbe368d883-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.578122 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.579047 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.580425 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.580433 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 
09:59:25.580947 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.581219 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.581586 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.585255 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2ad8ac2-2191-43ab-9979-9ccbe368d883-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.585971 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.586138 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.587710 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2ad8ac2-2191-43ab-9979-9ccbe368d883-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.595897 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7ndd\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-kube-api-access-v7ndd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.614209 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.758616 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.760175 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.768224 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.768252 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.768296 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.768224 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.768760 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-p72nv" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.768785 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.769069 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.775060 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.775961 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781290 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781343 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781368 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781448 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a51d044-d162-4938-8ca4-b4a200e78739-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781500 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 
09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781565 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781591 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a51d044-d162-4938-8ca4-b4a200e78739-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781612 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781655 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781684 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.781922 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lghhr\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-kube-api-access-lghhr\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883364 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lghhr\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-kube-api-access-lghhr\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883426 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883456 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883482 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883520 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3a51d044-d162-4938-8ca4-b4a200e78739-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883545 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883608 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883628 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a51d044-d162-4938-8ca4-b4a200e78739-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883653 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883680 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" 
Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.883712 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.884075 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.886212 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.886853 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.887125 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" event={"ID":"05909797-8c26-44c6-8214-ac4e8b981900","Type":"ContainerStarted","Data":"486f1b961660d6dd45496e33a07ff2e7057a4b3fe39fdb76cb9146b99b3ec76b"} Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.889752 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.890262 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.891037 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a51d044-d162-4938-8ca4-b4a200e78739-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.892227 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.892906 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" event={"ID":"2026402b-4401-489a-9f34-264a57ec2501","Type":"ContainerStarted","Data":"0527ffdf9fbc6dc23ff4f6f3b835a5cdd6860e8215b09792874e13e1007c00ee"} Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.890826 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.894896 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a51d044-d162-4938-8ca4-b4a200e78739-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.897403 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.906176 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lghhr\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-kube-api-access-lghhr\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:25 crc kubenswrapper[4922]: I0929 09:59:25.920610 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " pod="openstack/rabbitmq-server-0" Sep 29 09:59:26 crc kubenswrapper[4922]: I0929 09:59:26.143431 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.411648 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.425953 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.432730 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cbv9z" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.433042 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.433185 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.433678 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.436934 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.448364 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.465637 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.529934 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.530101 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f2f851a-d7af-4580-8867-6865c5f1d4ce-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc 
kubenswrapper[4922]: I0929 09:59:27.530314 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2f851a-d7af-4580-8867-6865c5f1d4ce-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.530352 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f2f851a-d7af-4580-8867-6865c5f1d4ce-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.530384 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdkvn\" (UniqueName: \"kubernetes.io/projected/0f2f851a-d7af-4580-8867-6865c5f1d4ce-kube-api-access-sdkvn\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.530407 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0f2f851a-d7af-4580-8867-6865c5f1d4ce-secrets\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.530582 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f2f851a-d7af-4580-8867-6865c5f1d4ce-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.530697 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f2f851a-d7af-4580-8867-6865c5f1d4ce-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.530786 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f2f851a-d7af-4580-8867-6865c5f1d4ce-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.633270 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2f851a-d7af-4580-8867-6865c5f1d4ce-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.633335 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f2f851a-d7af-4580-8867-6865c5f1d4ce-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.633360 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdkvn\" (UniqueName: \"kubernetes.io/projected/0f2f851a-d7af-4580-8867-6865c5f1d4ce-kube-api-access-sdkvn\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.633386 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0f2f851a-d7af-4580-8867-6865c5f1d4ce-secrets\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.633475 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f2f851a-d7af-4580-8867-6865c5f1d4ce-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.633519 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f2f851a-d7af-4580-8867-6865c5f1d4ce-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.633576 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f2f851a-d7af-4580-8867-6865c5f1d4ce-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.633651 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.633689 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f2f851a-d7af-4580-8867-6865c5f1d4ce-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.635790 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f2f851a-d7af-4580-8867-6865c5f1d4ce-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.635823 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.636741 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f2f851a-d7af-4580-8867-6865c5f1d4ce-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.637132 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f2f851a-d7af-4580-8867-6865c5f1d4ce-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.637222 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f2f851a-d7af-4580-8867-6865c5f1d4ce-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.643042 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f2f851a-d7af-4580-8867-6865c5f1d4ce-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.646102 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0f2f851a-d7af-4580-8867-6865c5f1d4ce-secrets\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.654682 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2f851a-d7af-4580-8867-6865c5f1d4ce-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.667650 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdkvn\" (UniqueName: \"kubernetes.io/projected/0f2f851a-d7af-4580-8867-6865c5f1d4ce-kube-api-access-sdkvn\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.677325 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"0f2f851a-d7af-4580-8867-6865c5f1d4ce\") " pod="openstack/openstack-galera-0" Sep 29 09:59:27 crc kubenswrapper[4922]: I0929 09:59:27.719392 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.384748 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.387157 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.389873 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2zs7t" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.391034 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.396320 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.396972 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.417028 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.552494 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.552577 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c255b1-65cb-42e0-b799-e3a735956220-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") 
" pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.552609 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7c255b1-65cb-42e0-b799-e3a735956220-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.552631 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f7c255b1-65cb-42e0-b799-e3a735956220-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.552660 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7c255b1-65cb-42e0-b799-e3a735956220-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.552704 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw224\" (UniqueName: \"kubernetes.io/projected/f7c255b1-65cb-42e0-b799-e3a735956220-kube-api-access-tw224\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.552731 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7c255b1-65cb-42e0-b799-e3a735956220-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.552756 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c255b1-65cb-42e0-b799-e3a735956220-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.552774 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7c255b1-65cb-42e0-b799-e3a735956220-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.654652 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw224\" (UniqueName: \"kubernetes.io/projected/f7c255b1-65cb-42e0-b799-e3a735956220-kube-api-access-tw224\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.654741 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7c255b1-65cb-42e0-b799-e3a735956220-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.654765 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c255b1-65cb-42e0-b799-e3a735956220-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.654786 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7c255b1-65cb-42e0-b799-e3a735956220-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.654809 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.654870 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c255b1-65cb-42e0-b799-e3a735956220-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.654905 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7c255b1-65cb-42e0-b799-e3a735956220-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.654929 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f7c255b1-65cb-42e0-b799-e3a735956220-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc 
kubenswrapper[4922]: I0929 09:59:28.654962 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7c255b1-65cb-42e0-b799-e3a735956220-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.655403 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.655505 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7c255b1-65cb-42e0-b799-e3a735956220-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.656886 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7c255b1-65cb-42e0-b799-e3a735956220-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.657032 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7c255b1-65cb-42e0-b799-e3a735956220-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.657543 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7c255b1-65cb-42e0-b799-e3a735956220-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.671053 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c255b1-65cb-42e0-b799-e3a735956220-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.671807 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c255b1-65cb-42e0-b799-e3a735956220-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.673039 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f7c255b1-65cb-42e0-b799-e3a735956220-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.683640 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw224\" (UniqueName: \"kubernetes.io/projected/f7c255b1-65cb-42e0-b799-e3a735956220-kube-api-access-tw224\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.745167 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f7c255b1-65cb-42e0-b799-e3a735956220\") " pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.751975 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.759802 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.766374 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.766606 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.766748 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dldph" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.817714 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.863881 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-kolla-config\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.863938 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.863974 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.864248 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-config-data\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.864353 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlpht\" (UniqueName: \"kubernetes.io/projected/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-kube-api-access-mlpht\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.965731 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-config-data\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.965809 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlpht\" (UniqueName: \"kubernetes.io/projected/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-kube-api-access-mlpht\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.965917 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-kolla-config\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.965945 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.965968 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.968095 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-kolla-config\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.968140 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-config-data\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.971540 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.977976 
4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:28 crc kubenswrapper[4922]: I0929 09:59:28.984711 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlpht\" (UniqueName: \"kubernetes.io/projected/617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c-kube-api-access-mlpht\") pod \"memcached-0\" (UID: \"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c\") " pod="openstack/memcached-0" Sep 29 09:59:29 crc kubenswrapper[4922]: I0929 09:59:29.019015 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 09:59:29 crc kubenswrapper[4922]: I0929 09:59:29.071138 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:59:29 crc kubenswrapper[4922]: I0929 09:59:29.071225 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:59:29 crc kubenswrapper[4922]: I0929 09:59:29.124013 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 29 09:59:30 crc kubenswrapper[4922]: I0929 09:59:30.528789 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 09:59:30 crc kubenswrapper[4922]: I0929 09:59:30.530448 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 09:59:30 crc kubenswrapper[4922]: I0929 09:59:30.537438 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-fqvct" Sep 29 09:59:30 crc kubenswrapper[4922]: I0929 09:59:30.551010 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 09:59:30 crc kubenswrapper[4922]: I0929 09:59:30.598118 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26qvz\" (UniqueName: \"kubernetes.io/projected/13dedc08-5eea-497d-a7f0-8509ea2000c0-kube-api-access-26qvz\") pod \"kube-state-metrics-0\" (UID: \"13dedc08-5eea-497d-a7f0-8509ea2000c0\") " pod="openstack/kube-state-metrics-0" Sep 29 09:59:30 crc kubenswrapper[4922]: I0929 09:59:30.699843 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26qvz\" (UniqueName: \"kubernetes.io/projected/13dedc08-5eea-497d-a7f0-8509ea2000c0-kube-api-access-26qvz\") pod \"kube-state-metrics-0\" (UID: \"13dedc08-5eea-497d-a7f0-8509ea2000c0\") " pod="openstack/kube-state-metrics-0" Sep 29 09:59:30 crc kubenswrapper[4922]: I0929 09:59:30.726787 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26qvz\" (UniqueName: \"kubernetes.io/projected/13dedc08-5eea-497d-a7f0-8509ea2000c0-kube-api-access-26qvz\") pod \"kube-state-metrics-0\" (UID: \"13dedc08-5eea-497d-a7f0-8509ea2000c0\") " pod="openstack/kube-state-metrics-0" Sep 29 09:59:30 crc kubenswrapper[4922]: I0929 09:59:30.856021 4922 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.476163 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6kqsg"] Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.491321 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.493771 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kqsg"] Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.499183 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.499482 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.499704 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7rg8x" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.507238 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bzdcj"] Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.511137 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.565313 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bzdcj"] Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592113 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4gzc\" (UniqueName: \"kubernetes.io/projected/404af620-a2df-4414-acfc-b669e8518298-kube-api-access-f4gzc\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592164 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12d2ae39-f918-485b-a8c4-b083cdf9d48f-scripts\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592184 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr94t\" (UniqueName: \"kubernetes.io/projected/12d2ae39-f918-485b-a8c4-b083cdf9d48f-kube-api-access-vr94t\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592202 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/404af620-a2df-4414-acfc-b669e8518298-scripts\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592258 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d2ae39-f918-485b-a8c4-b083cdf9d48f-ovn-controller-tls-certs\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592280 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-var-lib\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592303 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12d2ae39-f918-485b-a8c4-b083cdf9d48f-var-run\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592319 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-var-run\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592338 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-etc-ovs\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592364 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/12d2ae39-f918-485b-a8c4-b083cdf9d48f-var-run-ovn\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592378 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12d2ae39-f918-485b-a8c4-b083cdf9d48f-var-log-ovn\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592401 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-var-log\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.592454 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d2ae39-f918-485b-a8c4-b083cdf9d48f-combined-ca-bundle\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696350 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d2ae39-f918-485b-a8c4-b083cdf9d48f-combined-ca-bundle\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696499 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4gzc\" (UniqueName: 
\"kubernetes.io/projected/404af620-a2df-4414-acfc-b669e8518298-kube-api-access-f4gzc\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696520 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12d2ae39-f918-485b-a8c4-b083cdf9d48f-scripts\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696541 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr94t\" (UniqueName: \"kubernetes.io/projected/12d2ae39-f918-485b-a8c4-b083cdf9d48f-kube-api-access-vr94t\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696558 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/404af620-a2df-4414-acfc-b669e8518298-scripts\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696603 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d2ae39-f918-485b-a8c4-b083cdf9d48f-ovn-controller-tls-certs\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696622 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-var-lib\") pod \"ovn-controller-ovs-bzdcj\" (UID: 
\"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696645 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12d2ae39-f918-485b-a8c4-b083cdf9d48f-var-run\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696661 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-var-run\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696686 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-etc-ovs\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696721 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12d2ae39-f918-485b-a8c4-b083cdf9d48f-var-run-ovn\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696740 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12d2ae39-f918-485b-a8c4-b083cdf9d48f-var-log-ovn\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.696763 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-var-log\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.697497 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-var-log\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.697569 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-var-run\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.697571 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12d2ae39-f918-485b-a8c4-b083cdf9d48f-var-run\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.697703 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-etc-ovs\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.697793 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12d2ae39-f918-485b-a8c4-b083cdf9d48f-var-run-ovn\") pod \"ovn-controller-6kqsg\" 
(UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.697897 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12d2ae39-f918-485b-a8c4-b083cdf9d48f-var-log-ovn\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.700605 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/404af620-a2df-4414-acfc-b669e8518298-scripts\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.700777 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/404af620-a2df-4414-acfc-b669e8518298-var-lib\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.702512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12d2ae39-f918-485b-a8c4-b083cdf9d48f-scripts\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.707076 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d2ae39-f918-485b-a8c4-b083cdf9d48f-combined-ca-bundle\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.713144 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vr94t\" (UniqueName: \"kubernetes.io/projected/12d2ae39-f918-485b-a8c4-b083cdf9d48f-kube-api-access-vr94t\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.713403 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d2ae39-f918-485b-a8c4-b083cdf9d48f-ovn-controller-tls-certs\") pod \"ovn-controller-6kqsg\" (UID: \"12d2ae39-f918-485b-a8c4-b083cdf9d48f\") " pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.718487 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4gzc\" (UniqueName: \"kubernetes.io/projected/404af620-a2df-4414-acfc-b669e8518298-kube-api-access-f4gzc\") pod \"ovn-controller-ovs-bzdcj\" (UID: \"404af620-a2df-4414-acfc-b669e8518298\") " pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.819138 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:34 crc kubenswrapper[4922]: I0929 09:59:34.845641 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.610943 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.614623 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.619930 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.620431 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-99c74" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.620478 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.622577 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.623957 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.632538 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.756121 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.756202 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr94w\" (UniqueName: \"kubernetes.io/projected/d267e81e-9044-4619-b2f2-4c370674a31c-kube-api-access-dr94w\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.756260 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/d267e81e-9044-4619-b2f2-4c370674a31c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.756280 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d267e81e-9044-4619-b2f2-4c370674a31c-config\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.756335 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d267e81e-9044-4619-b2f2-4c370674a31c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.756380 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d267e81e-9044-4619-b2f2-4c370674a31c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.756398 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d267e81e-9044-4619-b2f2-4c370674a31c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.756427 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d267e81e-9044-4619-b2f2-4c370674a31c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.858549 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d267e81e-9044-4619-b2f2-4c370674a31c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.858628 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d267e81e-9044-4619-b2f2-4c370674a31c-config\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.858810 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d267e81e-9044-4619-b2f2-4c370674a31c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.858956 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d267e81e-9044-4619-b2f2-4c370674a31c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.858996 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d267e81e-9044-4619-b2f2-4c370674a31c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " 
pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.859084 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d267e81e-9044-4619-b2f2-4c370674a31c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.859138 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.859193 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr94w\" (UniqueName: \"kubernetes.io/projected/d267e81e-9044-4619-b2f2-4c370674a31c-kube-api-access-dr94w\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.859710 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.860240 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d267e81e-9044-4619-b2f2-4c370674a31c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.860274 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d267e81e-9044-4619-b2f2-4c370674a31c-config\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.860301 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d267e81e-9044-4619-b2f2-4c370674a31c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.866044 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d267e81e-9044-4619-b2f2-4c370674a31c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.866952 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d267e81e-9044-4619-b2f2-4c370674a31c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.867872 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d267e81e-9044-4619-b2f2-4c370674a31c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.879918 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr94w\" (UniqueName: \"kubernetes.io/projected/d267e81e-9044-4619-b2f2-4c370674a31c-kube-api-access-dr94w\") pod \"ovsdbserver-sb-0\" (UID: 
\"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.899233 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d267e81e-9044-4619-b2f2-4c370674a31c\") " pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:35 crc kubenswrapper[4922]: I0929 09:59:35.954303 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.164360 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.165874 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.168997 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ws2tp" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.173974 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.174084 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.174104 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.179580 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.292257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.292368 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.292437 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.292481 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.292523 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.292584 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.292627 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj5t4\" (UniqueName: \"kubernetes.io/projected/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-kube-api-access-tj5t4\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.292685 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.394362 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.394461 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.394515 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc 
kubenswrapper[4922]: I0929 09:59:37.394567 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.394605 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.394637 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.394680 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.394712 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj5t4\" (UniqueName: \"kubernetes.io/projected/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-kube-api-access-tj5t4\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.396433 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.396648 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.396963 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.398313 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.401984 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.407794 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 
09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.410237 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.419658 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj5t4\" (UniqueName: \"kubernetes.io/projected/d76a9416-91c8-4df6-b6a4-898c4df4ac1a-kube-api-access-tj5t4\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.435611 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d76a9416-91c8-4df6-b6a4-898c4df4ac1a\") " pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:37 crc kubenswrapper[4922]: I0929 09:59:37.506536 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 09:59:40 crc kubenswrapper[4922]: E0929 09:59:40.615566 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 29 09:59:40 crc kubenswrapper[4922]: E0929 09:59:40.616588 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4t69w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityCont
ext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-s2td6_openstack(d52948d1-a1df-4e02-8729-a720f5b2f748): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:59:40 crc kubenswrapper[4922]: E0929 09:59:40.617778 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" podUID="d52948d1-a1df-4e02-8729-a720f5b2f748" Sep 29 09:59:40 crc kubenswrapper[4922]: E0929 09:59:40.645011 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 29 09:59:40 crc kubenswrapper[4922]: E0929 09:59:40.645211 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c9j84,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-fjxrj_openstack(8425b0c8-2de6-4fc9-8b34-ba91b8034c32): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:59:40 crc kubenswrapper[4922]: E0929 09:59:40.646419 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" podUID="8425b0c8-2de6-4fc9-8b34-ba91b8034c32" Sep 29 09:59:40 crc kubenswrapper[4922]: E0929 09:59:40.715697 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 29 09:59:40 crc kubenswrapper[4922]: E0929 09:59:40.715957 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-skjc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPol
icy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-7ssvz_openstack(05909797-8c26-44c6-8214-ac4e8b981900): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 09:59:40 crc kubenswrapper[4922]: E0929 09:59:40.717349 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" podUID="05909797-8c26-44c6-8214-ac4e8b981900" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.135369 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.183346 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.219971 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.228162 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 09:59:41 crc kubenswrapper[4922]: W0929 09:59:41.231592 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a51d044_d162_4938_8ca4_b4a200e78739.slice/crio-542853881ba52660c1569c79c55c4d7f667bc023ff9e6400c3e1c1aae94d373e WatchSource:0}: Error finding container 542853881ba52660c1569c79c55c4d7f667bc023ff9e6400c3e1c1aae94d373e: Status 404 returned error can't find the container with id 542853881ba52660c1569c79c55c4d7f667bc023ff9e6400c3e1c1aae94d373e Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.299398 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 09:59:41 crc kubenswrapper[4922]: W0929 09:59:41.300777 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2ad8ac2_2191_43ab_9979_9ccbe368d883.slice/crio-a158d2ab863a555aa9cbc884dea642a860a9d3ed49f2aea712cac2338e5101fd WatchSource:0}: Error finding container a158d2ab863a555aa9cbc884dea642a860a9d3ed49f2aea712cac2338e5101fd: Status 404 returned error can't find the container with id a158d2ab863a555aa9cbc884dea642a860a9d3ed49f2aea712cac2338e5101fd Sep 29 09:59:41 crc kubenswrapper[4922]: E0929 09:59:41.370534 4922 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Sep 29 09:59:41 crc kubenswrapper[4922]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/05909797-8c26-44c6-8214-ac4e8b981900/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 29 09:59:41 crc kubenswrapper[4922]: > podSandboxID="486f1b961660d6dd45496e33a07ff2e7057a4b3fe39fdb76cb9146b99b3ec76b" Sep 29 09:59:41 crc kubenswrapper[4922]: E0929 09:59:41.370727 4922 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 29 09:59:41 crc kubenswrapper[4922]: init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d 
--hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-skjc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-7ssvz_openstack(05909797-8c26-44c6-8214-ac4e8b981900): 
CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/05909797-8c26-44c6-8214-ac4e8b981900/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 29 09:59:41 crc kubenswrapper[4922]: > logger="UnhandledError" Sep 29 09:59:41 crc kubenswrapper[4922]: E0929 09:59:41.371946 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/05909797-8c26-44c6-8214-ac4e8b981900/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" podUID="05909797-8c26-44c6-8214-ac4e8b981900" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.518608 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.524014 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:41 crc kubenswrapper[4922]: W0929 09:59:41.618719 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c255b1_65cb_42e0_b799_e3a735956220.slice/crio-67b01232a30313af81d09f95ca98efc6b1d7a97d84b845b6818b39133973fdd7 WatchSource:0}: Error finding container 67b01232a30313af81d09f95ca98efc6b1d7a97d84b845b6818b39133973fdd7: Status 404 returned error can't find the container with id 67b01232a30313af81d09f95ca98efc6b1d7a97d84b845b6818b39133973fdd7 Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.622435 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.630083 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.688517 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t69w\" (UniqueName: \"kubernetes.io/projected/d52948d1-a1df-4e02-8729-a720f5b2f748-kube-api-access-4t69w\") pod \"d52948d1-a1df-4e02-8729-a720f5b2f748\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.688669 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9j84\" (UniqueName: \"kubernetes.io/projected/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-kube-api-access-c9j84\") pod \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\" (UID: \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\") " Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.688744 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-config\") pod \"d52948d1-a1df-4e02-8729-a720f5b2f748\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") 
" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.688780 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-dns-svc\") pod \"d52948d1-a1df-4e02-8729-a720f5b2f748\" (UID: \"d52948d1-a1df-4e02-8729-a720f5b2f748\") " Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.688871 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-config\") pod \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\" (UID: \"8425b0c8-2de6-4fc9-8b34-ba91b8034c32\") " Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.691897 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-config" (OuterVolumeSpecName: "config") pod "d52948d1-a1df-4e02-8729-a720f5b2f748" (UID: "d52948d1-a1df-4e02-8729-a720f5b2f748"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.693122 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d52948d1-a1df-4e02-8729-a720f5b2f748" (UID: "d52948d1-a1df-4e02-8729-a720f5b2f748"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.694103 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-config" (OuterVolumeSpecName: "config") pod "8425b0c8-2de6-4fc9-8b34-ba91b8034c32" (UID: "8425b0c8-2de6-4fc9-8b34-ba91b8034c32"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.697962 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kqsg"] Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.704395 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52948d1-a1df-4e02-8729-a720f5b2f748-kube-api-access-4t69w" (OuterVolumeSpecName: "kube-api-access-4t69w") pod "d52948d1-a1df-4e02-8729-a720f5b2f748" (UID: "d52948d1-a1df-4e02-8729-a720f5b2f748"). InnerVolumeSpecName "kube-api-access-4t69w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.708552 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-kube-api-access-c9j84" (OuterVolumeSpecName: "kube-api-access-c9j84") pod "8425b0c8-2de6-4fc9-8b34-ba91b8034c32" (UID: "8425b0c8-2de6-4fc9-8b34-ba91b8034c32"). InnerVolumeSpecName "kube-api-access-c9j84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.798175 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.798219 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t69w\" (UniqueName: \"kubernetes.io/projected/d52948d1-a1df-4e02-8729-a720f5b2f748-kube-api-access-4t69w\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.798237 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9j84\" (UniqueName: \"kubernetes.io/projected/8425b0c8-2de6-4fc9-8b34-ba91b8034c32-kube-api-access-c9j84\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.798249 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.798260 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52948d1-a1df-4e02-8729-a720f5b2f748-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.807909 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 09:59:41 crc kubenswrapper[4922]: W0929 09:59:41.813870 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd76a9416_91c8_4df6_b6a4_898c4df4ac1a.slice/crio-d136a3e218661ace3519b0b940645c3b0c7ca64f526b31be2c652d81fae6fb8f WatchSource:0}: Error finding container d136a3e218661ace3519b0b940645c3b0c7ca64f526b31be2c652d81fae6fb8f: Status 404 returned error can't find the container 
with id d136a3e218661ace3519b0b940645c3b0c7ca64f526b31be2c652d81fae6fb8f Sep 29 09:59:41 crc kubenswrapper[4922]: I0929 09:59:41.923301 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bzdcj"] Sep 29 09:59:41 crc kubenswrapper[4922]: W0929 09:59:41.923614 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod404af620_a2df_4414_acfc_b669e8518298.slice/crio-e3eb25dee95f781fa557a51bb4a96bc7a4ee1c6f61e85de2ca3960d3b51c3e2a WatchSource:0}: Error finding container e3eb25dee95f781fa557a51bb4a96bc7a4ee1c6f61e85de2ca3960d3b51c3e2a: Status 404 returned error can't find the container with id e3eb25dee95f781fa557a51bb4a96bc7a4ee1c6f61e85de2ca3960d3b51c3e2a Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.063432 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a51d044-d162-4938-8ca4-b4a200e78739","Type":"ContainerStarted","Data":"542853881ba52660c1569c79c55c4d7f667bc023ff9e6400c3e1c1aae94d373e"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.065706 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d76a9416-91c8-4df6-b6a4-898c4df4ac1a","Type":"ContainerStarted","Data":"d136a3e218661ace3519b0b940645c3b0c7ca64f526b31be2c652d81fae6fb8f"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.067374 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13dedc08-5eea-497d-a7f0-8509ea2000c0","Type":"ContainerStarted","Data":"aea6de548e89658a44fcdd4f43ca950bc84d945f4a7e82a4c9106dc6f277f913"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.068722 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bzdcj" event={"ID":"404af620-a2df-4414-acfc-b669e8518298","Type":"ContainerStarted","Data":"e3eb25dee95f781fa557a51bb4a96bc7a4ee1c6f61e85de2ca3960d3b51c3e2a"} 
Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.070549 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f2f851a-d7af-4580-8867-6865c5f1d4ce","Type":"ContainerStarted","Data":"fbc4bb778b12994cdffc34a63ed5dc4943f5b891b6f3b16a70f685e9d2fb503e"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.072325 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2ad8ac2-2191-43ab-9979-9ccbe368d883","Type":"ContainerStarted","Data":"a158d2ab863a555aa9cbc884dea642a860a9d3ed49f2aea712cac2338e5101fd"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.075596 4922 generic.go:334] "Generic (PLEG): container finished" podID="2026402b-4401-489a-9f34-264a57ec2501" containerID="e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807" exitCode=0 Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.075672 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" event={"ID":"2026402b-4401-489a-9f34-264a57ec2501","Type":"ContainerDied","Data":"e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.078982 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" event={"ID":"8425b0c8-2de6-4fc9-8b34-ba91b8034c32","Type":"ContainerDied","Data":"a197ab2c29358123ada14e70ef8480dbc44feb0e3354a455cc0902f8b4db878d"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.079081 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fjxrj" Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.084229 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c","Type":"ContainerStarted","Data":"c6c2295d92f81674c7cd517ee952438904226921dfc7e41c3f1a27fbf4c84722"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.085996 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7c255b1-65cb-42e0-b799-e3a735956220","Type":"ContainerStarted","Data":"67b01232a30313af81d09f95ca98efc6b1d7a97d84b845b6818b39133973fdd7"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.087388 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kqsg" event={"ID":"12d2ae39-f918-485b-a8c4-b083cdf9d48f","Type":"ContainerStarted","Data":"267fa4b62747b7cec5f02313259b597cbf13d37d59df562e080e758f1dd9aae4"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.088898 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" event={"ID":"d52948d1-a1df-4e02-8729-a720f5b2f748","Type":"ContainerDied","Data":"79b34980173307589adba261517375d5069cf07b8a93bda36f0fc8549ed9c0c2"} Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.088968 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s2td6" Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.271240 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s2td6"] Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.278212 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s2td6"] Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.311028 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fjxrj"] Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.316933 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fjxrj"] Sep 29 09:59:42 crc kubenswrapper[4922]: I0929 09:59:42.856009 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 09:59:42 crc kubenswrapper[4922]: W0929 09:59:42.871720 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd267e81e_9044_4619_b2f2_4c370674a31c.slice/crio-916512083ed6be591199c7765b4547d645750158a445c2da901e4f15181bbdb8 WatchSource:0}: Error finding container 916512083ed6be591199c7765b4547d645750158a445c2da901e4f15181bbdb8: Status 404 returned error can't find the container with id 916512083ed6be591199c7765b4547d645750158a445c2da901e4f15181bbdb8 Sep 29 09:59:43 crc kubenswrapper[4922]: I0929 09:59:43.107401 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d267e81e-9044-4619-b2f2-4c370674a31c","Type":"ContainerStarted","Data":"916512083ed6be591199c7765b4547d645750158a445c2da901e4f15181bbdb8"} Sep 29 09:59:43 crc kubenswrapper[4922]: I0929 09:59:43.475255 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8425b0c8-2de6-4fc9-8b34-ba91b8034c32" path="/var/lib/kubelet/pods/8425b0c8-2de6-4fc9-8b34-ba91b8034c32/volumes" Sep 29 09:59:43 crc 
kubenswrapper[4922]: I0929 09:59:43.476429 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d52948d1-a1df-4e02-8729-a720f5b2f748" path="/var/lib/kubelet/pods/d52948d1-a1df-4e02-8729-a720f5b2f748/volumes" Sep 29 09:59:45 crc kubenswrapper[4922]: I0929 09:59:45.131348 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" event={"ID":"2026402b-4401-489a-9f34-264a57ec2501","Type":"ContainerStarted","Data":"268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4"} Sep 29 09:59:45 crc kubenswrapper[4922]: I0929 09:59:45.131808 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:45 crc kubenswrapper[4922]: I0929 09:59:45.156234 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" podStartSLOduration=5.687552263 podStartE2EDuration="21.156193705s" podCreationTimestamp="2025-09-29 09:59:24 +0000 UTC" firstStartedPulling="2025-09-29 09:59:25.561443885 +0000 UTC m=+890.927674149" lastFinishedPulling="2025-09-29 09:59:41.030085327 +0000 UTC m=+906.396315591" observedRunningTime="2025-09-29 09:59:45.152653659 +0000 UTC m=+910.518883913" watchObservedRunningTime="2025-09-29 09:59:45.156193705 +0000 UTC m=+910.522423969" Sep 29 09:59:50 crc kubenswrapper[4922]: I0929 09:59:50.029480 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 09:59:50 crc kubenswrapper[4922]: I0929 09:59:50.097917 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7ssvz"] Sep 29 09:59:50 crc kubenswrapper[4922]: I0929 09:59:50.765626 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:50 crc kubenswrapper[4922]: I0929 09:59:50.930058 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-dns-svc\") pod \"05909797-8c26-44c6-8214-ac4e8b981900\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " Sep 29 09:59:50 crc kubenswrapper[4922]: I0929 09:59:50.930104 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-config\") pod \"05909797-8c26-44c6-8214-ac4e8b981900\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " Sep 29 09:59:50 crc kubenswrapper[4922]: I0929 09:59:50.930312 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skjc2\" (UniqueName: \"kubernetes.io/projected/05909797-8c26-44c6-8214-ac4e8b981900-kube-api-access-skjc2\") pod \"05909797-8c26-44c6-8214-ac4e8b981900\" (UID: \"05909797-8c26-44c6-8214-ac4e8b981900\") " Sep 29 09:59:50 crc kubenswrapper[4922]: I0929 09:59:50.943158 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05909797-8c26-44c6-8214-ac4e8b981900-kube-api-access-skjc2" (OuterVolumeSpecName: "kube-api-access-skjc2") pod "05909797-8c26-44c6-8214-ac4e8b981900" (UID: "05909797-8c26-44c6-8214-ac4e8b981900"). InnerVolumeSpecName "kube-api-access-skjc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 09:59:50 crc kubenswrapper[4922]: I0929 09:59:50.979876 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-config" (OuterVolumeSpecName: "config") pod "05909797-8c26-44c6-8214-ac4e8b981900" (UID: "05909797-8c26-44c6-8214-ac4e8b981900"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:59:50 crc kubenswrapper[4922]: I0929 09:59:50.983063 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05909797-8c26-44c6-8214-ac4e8b981900" (UID: "05909797-8c26-44c6-8214-ac4e8b981900"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 09:59:51 crc kubenswrapper[4922]: I0929 09:59:51.034081 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-config\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:51 crc kubenswrapper[4922]: I0929 09:59:51.034118 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05909797-8c26-44c6-8214-ac4e8b981900-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:51 crc kubenswrapper[4922]: I0929 09:59:51.034128 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skjc2\" (UniqueName: \"kubernetes.io/projected/05909797-8c26-44c6-8214-ac4e8b981900-kube-api-access-skjc2\") on node \"crc\" DevicePath \"\"" Sep 29 09:59:51 crc kubenswrapper[4922]: I0929 09:59:51.190131 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" event={"ID":"05909797-8c26-44c6-8214-ac4e8b981900","Type":"ContainerDied","Data":"486f1b961660d6dd45496e33a07ff2e7057a4b3fe39fdb76cb9146b99b3ec76b"} Sep 29 09:59:51 crc kubenswrapper[4922]: I0929 09:59:51.190314 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7ssvz" Sep 29 09:59:51 crc kubenswrapper[4922]: I0929 09:59:51.271538 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7ssvz"] Sep 29 09:59:51 crc kubenswrapper[4922]: I0929 09:59:51.278411 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7ssvz"] Sep 29 09:59:51 crc kubenswrapper[4922]: I0929 09:59:51.461325 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05909797-8c26-44c6-8214-ac4e8b981900" path="/var/lib/kubelet/pods/05909797-8c26-44c6-8214-ac4e8b981900/volumes" Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.222890 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kqsg" event={"ID":"12d2ae39-f918-485b-a8c4-b083cdf9d48f","Type":"ContainerStarted","Data":"f43f4b115fd429d1cdcb67eb020d520765970f0de0a13291c73980c625e716b8"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.223675 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6kqsg" Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.224323 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a51d044-d162-4938-8ca4-b4a200e78739","Type":"ContainerStarted","Data":"faadddd4d5d9c294d7d0d82cbdccb37186b92c157c6a6cbb4ca84753ab65f49a"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.227221 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d267e81e-9044-4619-b2f2-4c370674a31c","Type":"ContainerStarted","Data":"d2e7142d2555c979f3443ee205671894506e72438393f3b35a23a98458688c51"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.229245 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"d76a9416-91c8-4df6-b6a4-898c4df4ac1a","Type":"ContainerStarted","Data":"56c88e56d1f2e93af7cbc349db84454ecc4d8504e18f8d12fa41c0c8c2daa652"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.231602 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13dedc08-5eea-497d-a7f0-8509ea2000c0","Type":"ContainerStarted","Data":"b57f2910ae8de2ced077b6288e0736d8935e9d6277482b913d7cef55133f0655"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.231853 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.237341 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7c255b1-65cb-42e0-b799-e3a735956220","Type":"ContainerStarted","Data":"cbcbbea445782568e798b968345291aaa4e1f31569b5c459318b77830e608100"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.240222 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f2f851a-d7af-4580-8867-6865c5f1d4ce","Type":"ContainerStarted","Data":"3f381c499d7ac22838b74105b999e26ce80ffcc4ebef774ed44087a582d3b3fa"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.266925 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2ad8ac2-2191-43ab-9979-9ccbe368d883","Type":"ContainerStarted","Data":"c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.276241 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c","Type":"ContainerStarted","Data":"5d442f9d497716689c8f8130127f1aa0a6d9a80f7bf66a4b6de9db2cee38562e"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.276972 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/memcached-0" Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.282335 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6kqsg" podStartSLOduration=9.633349834 podStartE2EDuration="21.282309258s" podCreationTimestamp="2025-09-29 09:59:34 +0000 UTC" firstStartedPulling="2025-09-29 09:59:41.722454005 +0000 UTC m=+907.088684289" lastFinishedPulling="2025-09-29 09:59:53.371413409 +0000 UTC m=+918.737643713" observedRunningTime="2025-09-29 09:59:55.259270435 +0000 UTC m=+920.625500719" watchObservedRunningTime="2025-09-29 09:59:55.282309258 +0000 UTC m=+920.648539522" Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.303984 4922 generic.go:334] "Generic (PLEG): container finished" podID="404af620-a2df-4414-acfc-b669e8518298" containerID="c6b7bf7d2195817e6719ba3e4639b0e3211b355393e7191e410c8ef427535194" exitCode=0 Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.304058 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bzdcj" event={"ID":"404af620-a2df-4414-acfc-b669e8518298","Type":"ContainerDied","Data":"c6b7bf7d2195817e6719ba3e4639b0e3211b355393e7191e410c8ef427535194"} Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.381606 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.76367964 podStartE2EDuration="25.381584763s" podCreationTimestamp="2025-09-29 09:59:30 +0000 UTC" firstStartedPulling="2025-09-29 09:59:41.618559145 +0000 UTC m=+906.984789409" lastFinishedPulling="2025-09-29 09:59:54.236464268 +0000 UTC m=+919.602694532" observedRunningTime="2025-09-29 09:59:55.375436777 +0000 UTC m=+920.741667131" watchObservedRunningTime="2025-09-29 09:59:55.381584763 +0000 UTC m=+920.747815027" Sep 29 09:59:55 crc kubenswrapper[4922]: I0929 09:59:55.404118 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" 
podStartSLOduration=16.610683359 podStartE2EDuration="27.404090822s" podCreationTimestamp="2025-09-29 09:59:28 +0000 UTC" firstStartedPulling="2025-09-29 09:59:41.234569028 +0000 UTC m=+906.600799292" lastFinishedPulling="2025-09-29 09:59:52.027976491 +0000 UTC m=+917.394206755" observedRunningTime="2025-09-29 09:59:55.400588087 +0000 UTC m=+920.766818351" watchObservedRunningTime="2025-09-29 09:59:55.404090822 +0000 UTC m=+920.770321096" Sep 29 09:59:56 crc kubenswrapper[4922]: I0929 09:59:56.321264 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bzdcj" event={"ID":"404af620-a2df-4414-acfc-b669e8518298","Type":"ContainerStarted","Data":"003d88e145bb327074ef20c6f59ebd0f0ef3e4ba41fdf9bcfb41fdb8e1c25923"} Sep 29 09:59:56 crc kubenswrapper[4922]: I0929 09:59:56.321653 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bzdcj" event={"ID":"404af620-a2df-4414-acfc-b669e8518298","Type":"ContainerStarted","Data":"0ca7940e027cb5d1bc01664359e1c42a90a7fdf33f191442f8086c20d499c276"} Sep 29 09:59:56 crc kubenswrapper[4922]: I0929 09:59:56.322044 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:56 crc kubenswrapper[4922]: I0929 09:59:56.342168 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bzdcj" podStartSLOduration=11.305831103 podStartE2EDuration="22.342151105s" podCreationTimestamp="2025-09-29 09:59:34 +0000 UTC" firstStartedPulling="2025-09-29 09:59:41.92632154 +0000 UTC m=+907.292551814" lastFinishedPulling="2025-09-29 09:59:52.962641522 +0000 UTC m=+918.328871816" observedRunningTime="2025-09-29 09:59:56.339808412 +0000 UTC m=+921.706038676" watchObservedRunningTime="2025-09-29 09:59:56.342151105 +0000 UTC m=+921.708381369" Sep 29 09:59:57 crc kubenswrapper[4922]: I0929 09:59:57.332429 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 09:59:58 crc kubenswrapper[4922]: E0929 09:59:58.341746 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f2f851a_d7af_4580_8867_6865c5f1d4ce.slice/crio-3f381c499d7ac22838b74105b999e26ce80ffcc4ebef774ed44087a582d3b3fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f2f851a_d7af_4580_8867_6865c5f1d4ce.slice/crio-conmon-3f381c499d7ac22838b74105b999e26ce80ffcc4ebef774ed44087a582d3b3fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c255b1_65cb_42e0_b799_e3a735956220.slice/crio-cbcbbea445782568e798b968345291aaa4e1f31569b5c459318b77830e608100.scope\": RecentStats: unable to find data in memory cache]" Sep 29 09:59:58 crc kubenswrapper[4922]: I0929 09:59:58.348257 4922 generic.go:334] "Generic (PLEG): container finished" podID="f7c255b1-65cb-42e0-b799-e3a735956220" containerID="cbcbbea445782568e798b968345291aaa4e1f31569b5c459318b77830e608100" exitCode=0 Sep 29 09:59:58 crc kubenswrapper[4922]: I0929 09:59:58.348356 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7c255b1-65cb-42e0-b799-e3a735956220","Type":"ContainerDied","Data":"cbcbbea445782568e798b968345291aaa4e1f31569b5c459318b77830e608100"} Sep 29 09:59:58 crc kubenswrapper[4922]: I0929 09:59:58.350929 4922 generic.go:334] "Generic (PLEG): container finished" podID="0f2f851a-d7af-4580-8867-6865c5f1d4ce" containerID="3f381c499d7ac22838b74105b999e26ce80ffcc4ebef774ed44087a582d3b3fa" exitCode=0 Sep 29 09:59:58 crc kubenswrapper[4922]: I0929 09:59:58.351038 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"0f2f851a-d7af-4580-8867-6865c5f1d4ce","Type":"ContainerDied","Data":"3f381c499d7ac22838b74105b999e26ce80ffcc4ebef774ed44087a582d3b3fa"} Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.071371 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.072460 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.072658 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.073911 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7884bf02a997a61f9124b5ac0faf1322742549dc99578bbb4ee5d6c1d6b88217"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.074157 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://7884bf02a997a61f9124b5ac0faf1322742549dc99578bbb4ee5d6c1d6b88217" gracePeriod=600 Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.127718 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.362264 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="7884bf02a997a61f9124b5ac0faf1322742549dc99578bbb4ee5d6c1d6b88217" exitCode=0 Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.362345 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"7884bf02a997a61f9124b5ac0faf1322742549dc99578bbb4ee5d6c1d6b88217"} Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.362418 4922 scope.go:117] "RemoveContainer" containerID="438f14a9f27df3e3e3379a1de404ccf8246b85d1a7a877658b63d5fd223866ed" Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.365363 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7c255b1-65cb-42e0-b799-e3a735956220","Type":"ContainerStarted","Data":"303dec763c074158368a5c713b0c8eaa54cba324266245516c59d0ebd17018c6"} Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.369532 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d267e81e-9044-4619-b2f2-4c370674a31c","Type":"ContainerStarted","Data":"574bca556e655c63c6f5a15213152ee56e51466b1db6f4656578c4eee4633e97"} Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.374286 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d76a9416-91c8-4df6-b6a4-898c4df4ac1a","Type":"ContainerStarted","Data":"f600c1b074ef1e1a02fb8264f638b5f19c76e0d6554af268416e41ca6cacbdee"} Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.376131 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"0f2f851a-d7af-4580-8867-6865c5f1d4ce","Type":"ContainerStarted","Data":"afc7812ae65f26a06520685ae7867c4dd5996531887c1736721f5e1c667e057d"} Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.392075 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.136689664 podStartE2EDuration="32.392050902s" podCreationTimestamp="2025-09-29 09:59:27 +0000 UTC" firstStartedPulling="2025-09-29 09:59:41.621372741 +0000 UTC m=+906.987603005" lastFinishedPulling="2025-09-29 09:59:52.876733979 +0000 UTC m=+918.242964243" observedRunningTime="2025-09-29 09:59:59.387499779 +0000 UTC m=+924.753730063" watchObservedRunningTime="2025-09-29 09:59:59.392050902 +0000 UTC m=+924.758281186" Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.414215 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.836106479 podStartE2EDuration="23.414189691s" podCreationTimestamp="2025-09-29 09:59:36 +0000 UTC" firstStartedPulling="2025-09-29 09:59:41.818109533 +0000 UTC m=+907.184339797" lastFinishedPulling="2025-09-29 09:59:58.396192735 +0000 UTC m=+923.762423009" observedRunningTime="2025-09-29 09:59:59.411013436 +0000 UTC m=+924.777243700" watchObservedRunningTime="2025-09-29 09:59:59.414189691 +0000 UTC m=+924.780419955" Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.435984 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.24514313 podStartE2EDuration="33.435958991s" podCreationTimestamp="2025-09-29 09:59:26 +0000 UTC" firstStartedPulling="2025-09-29 09:59:41.183119437 +0000 UTC m=+906.549349691" lastFinishedPulling="2025-09-29 09:59:53.373935258 +0000 UTC m=+918.740165552" observedRunningTime="2025-09-29 09:59:59.433686969 +0000 UTC m=+924.799917243" watchObservedRunningTime="2025-09-29 09:59:59.435958991 +0000 UTC m=+924.802189255" Sep 
29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.464483 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.925126486 podStartE2EDuration="25.464451721s" podCreationTimestamp="2025-09-29 09:59:34 +0000 UTC" firstStartedPulling="2025-09-29 09:59:42.87425305 +0000 UTC m=+908.240483324" lastFinishedPulling="2025-09-29 09:59:58.413578285 +0000 UTC m=+923.779808559" observedRunningTime="2025-09-29 09:59:59.463517295 +0000 UTC m=+924.829747569" watchObservedRunningTime="2025-09-29 09:59:59.464451721 +0000 UTC m=+924.830681985" Sep 29 09:59:59 crc kubenswrapper[4922]: I0929 09:59:59.955213 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.015015 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.152363 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9"] Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.153753 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.156241 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.156652 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.158622 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9"] Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.245737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hscn\" (UniqueName: \"kubernetes.io/projected/6ae87e51-bfce-41e2-b41a-327df982e7aa-kube-api-access-7hscn\") pod \"collect-profiles-29319000-5vgv9\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.246085 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ae87e51-bfce-41e2-b41a-327df982e7aa-config-volume\") pod \"collect-profiles-29319000-5vgv9\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.246354 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ae87e51-bfce-41e2-b41a-327df982e7aa-secret-volume\") pod \"collect-profiles-29319000-5vgv9\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.348422 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hscn\" (UniqueName: \"kubernetes.io/projected/6ae87e51-bfce-41e2-b41a-327df982e7aa-kube-api-access-7hscn\") pod \"collect-profiles-29319000-5vgv9\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.348956 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ae87e51-bfce-41e2-b41a-327df982e7aa-config-volume\") pod \"collect-profiles-29319000-5vgv9\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.349102 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ae87e51-bfce-41e2-b41a-327df982e7aa-secret-volume\") pod \"collect-profiles-29319000-5vgv9\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.349944 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ae87e51-bfce-41e2-b41a-327df982e7aa-config-volume\") pod \"collect-profiles-29319000-5vgv9\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.370215 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6ae87e51-bfce-41e2-b41a-327df982e7aa-secret-volume\") pod \"collect-profiles-29319000-5vgv9\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.374431 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hscn\" (UniqueName: \"kubernetes.io/projected/6ae87e51-bfce-41e2-b41a-327df982e7aa-kube-api-access-7hscn\") pod \"collect-profiles-29319000-5vgv9\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.391034 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"2a477bfa77fba14648b7136b725546b719661c46663d83dacb1d16385e73fcc2"} Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.392161 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.453239 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.520179 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.862798 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tzngn"] Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.864767 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.869098 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.889429 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tzngn"] Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.913666 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.965917 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.965970 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87295\" (UniqueName: \"kubernetes.io/projected/95ffb184-927a-4812-93a5-343a35a86cbf-kube-api-access-87295\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.966020 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:00 crc kubenswrapper[4922]: I0929 10:00:00.966096 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-config\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.025977 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-cgtjg"] Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.027372 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.033229 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070344 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070494 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf954c93-5942-433b-bbb7-6f0737969eb5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070559 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-config\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070591 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmb8\" (UniqueName: \"kubernetes.io/projected/cf954c93-5942-433b-bbb7-6f0737969eb5-kube-api-access-fdmb8\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070652 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf954c93-5942-433b-bbb7-6f0737969eb5-combined-ca-bundle\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070674 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070695 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cf954c93-5942-433b-bbb7-6f0737969eb5-ovn-rundir\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070739 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cf954c93-5942-433b-bbb7-6f0737969eb5-ovs-rundir\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070761 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf954c93-5942-433b-bbb7-6f0737969eb5-config\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.070796 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87295\" (UniqueName: \"kubernetes.io/projected/95ffb184-927a-4812-93a5-343a35a86cbf-kube-api-access-87295\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.072821 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cgtjg"] Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.076107 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-config\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.076135 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.081504 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " 
pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.111053 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87295\" (UniqueName: \"kubernetes.io/projected/95ffb184-927a-4812-93a5-343a35a86cbf-kube-api-access-87295\") pod \"dnsmasq-dns-7f896c8c65-tzngn\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.176149 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf954c93-5942-433b-bbb7-6f0737969eb5-combined-ca-bundle\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.176213 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cf954c93-5942-433b-bbb7-6f0737969eb5-ovn-rundir\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.176248 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cf954c93-5942-433b-bbb7-6f0737969eb5-ovs-rundir\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.176273 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf954c93-5942-433b-bbb7-6f0737969eb5-config\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc 
kubenswrapper[4922]: I0929 10:00:01.176347 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf954c93-5942-433b-bbb7-6f0737969eb5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.176387 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdmb8\" (UniqueName: \"kubernetes.io/projected/cf954c93-5942-433b-bbb7-6f0737969eb5-kube-api-access-fdmb8\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.176938 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cf954c93-5942-433b-bbb7-6f0737969eb5-ovs-rundir\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.177504 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf954c93-5942-433b-bbb7-6f0737969eb5-config\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.178256 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cf954c93-5942-433b-bbb7-6f0737969eb5-ovn-rundir\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.188425 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf954c93-5942-433b-bbb7-6f0737969eb5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.210397 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.222349 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf954c93-5942-433b-bbb7-6f0737969eb5-combined-ca-bundle\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.227510 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmb8\" (UniqueName: \"kubernetes.io/projected/cf954c93-5942-433b-bbb7-6f0737969eb5-kube-api-access-fdmb8\") pod \"ovn-controller-metrics-cgtjg\" (UID: \"cf954c93-5942-433b-bbb7-6f0737969eb5\") " pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.285170 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tzngn"] Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.293806 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9"] Sep 29 10:00:01 crc kubenswrapper[4922]: W0929 10:00:01.324903 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ae87e51_bfce_41e2_b41a_327df982e7aa.slice/crio-23f96b5b873149911adf0ab6363c927f3f2526349c89fe7cf346c8fb2b29a6cf WatchSource:0}: Error finding container 
23f96b5b873149911adf0ab6363c927f3f2526349c89fe7cf346c8fb2b29a6cf: Status 404 returned error can't find the container with id 23f96b5b873149911adf0ab6363c927f3f2526349c89fe7cf346c8fb2b29a6cf Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.374100 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-pk48l"] Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.375655 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.391454 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cgtjg" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.423868 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-pk48l"] Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.427470 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" event={"ID":"6ae87e51-bfce-41e2-b41a-327df982e7aa","Type":"ContainerStarted","Data":"23f96b5b873149911adf0ab6363c927f3f2526349c89fe7cf346c8fb2b29a6cf"} Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.487366 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frhv\" (UniqueName: \"kubernetes.io/projected/33790736-04eb-4bb1-b17f-512e8815f939-kube-api-access-6frhv\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.487944 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-config\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " 
pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.487974 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.488021 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.498315 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-pk48l"] Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.507611 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.545798 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-8nm2t"] Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.547765 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: E0929 10:00:01.556908 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-6frhv ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" podUID="33790736-04eb-4bb1-b17f-512e8815f939" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.566805 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.591613 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frhv\" (UniqueName: \"kubernetes.io/projected/33790736-04eb-4bb1-b17f-512e8815f939-kube-api-access-6frhv\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.591774 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-config\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.591816 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.591910 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-dns-svc\") 
pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.592150 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-config\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.592183 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfbkk\" (UniqueName: \"kubernetes.io/projected/3b99b68d-2f67-466e-88af-d60bc5d9d283-kube-api-access-pfbkk\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.592231 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.592283 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.592311 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-dns-svc\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.595283 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-config\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.595300 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.595729 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.603698 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8nm2t"] Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.607914 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.627499 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frhv\" (UniqueName: \"kubernetes.io/projected/33790736-04eb-4bb1-b17f-512e8815f939-kube-api-access-6frhv\") pod \"dnsmasq-dns-6c89d5d749-pk48l\" (UID: 
\"33790736-04eb-4bb1-b17f-512e8815f939\") " pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.750807 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-config\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.751454 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfbkk\" (UniqueName: \"kubernetes.io/projected/3b99b68d-2f67-466e-88af-d60bc5d9d283-kube-api-access-pfbkk\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.751502 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.751537 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.751568 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-dns-svc\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " 
pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.753430 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-config\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.754573 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.755551 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.765188 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-dns-svc\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.769228 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tzngn"] Sep 29 10:00:01 crc kubenswrapper[4922]: W0929 10:00:01.778645 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95ffb184_927a_4812_93a5_343a35a86cbf.slice/crio-fcae2883642512f43979e233e8643d610d0b8c3e91aa17292d09cbc521a5b82c WatchSource:0}: Error finding container fcae2883642512f43979e233e8643d610d0b8c3e91aa17292d09cbc521a5b82c: Status 404 returned error can't find the container with id fcae2883642512f43979e233e8643d610d0b8c3e91aa17292d09cbc521a5b82c Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.795982 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfbkk\" (UniqueName: \"kubernetes.io/projected/3b99b68d-2f67-466e-88af-d60bc5d9d283-kube-api-access-pfbkk\") pod \"dnsmasq-dns-698758b865-8nm2t\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:01 crc kubenswrapper[4922]: I0929 10:00:01.925257 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.035389 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cgtjg"] Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.373271 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.378760 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.381473 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.382005 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.382181 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rmlt5" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.382227 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.404121 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.440768 4922 generic.go:334] "Generic (PLEG): container finished" podID="95ffb184-927a-4812-93a5-343a35a86cbf" containerID="803cc67e1ab3eb92e74b9c8b48b46928bf1c25318b5e312c9e778d53abcbcb72" exitCode=0 Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.441162 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" event={"ID":"95ffb184-927a-4812-93a5-343a35a86cbf","Type":"ContainerDied","Data":"803cc67e1ab3eb92e74b9c8b48b46928bf1c25318b5e312c9e778d53abcbcb72"} Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.441198 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" event={"ID":"95ffb184-927a-4812-93a5-343a35a86cbf","Type":"ContainerStarted","Data":"fcae2883642512f43979e233e8643d610d0b8c3e91aa17292d09cbc521a5b82c"} Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.452959 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cgtjg" 
event={"ID":"cf954c93-5942-433b-bbb7-6f0737969eb5","Type":"ContainerStarted","Data":"09f1231768cf68582692800c5f8616cd539cf09fde38b9b53d973c2bc19c122f"} Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.453013 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cgtjg" event={"ID":"cf954c93-5942-433b-bbb7-6f0737969eb5","Type":"ContainerStarted","Data":"de3e0642b6b6a5695bb5a152b54c0128fa90fa404e4a1be75dd09a3a218a164c"} Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.456724 4922 generic.go:334] "Generic (PLEG): container finished" podID="6ae87e51-bfce-41e2-b41a-327df982e7aa" containerID="40f68fa2354de3ddc41781cf3181216cd74d2a28888e580f2819a08cf20cc97d" exitCode=0 Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.456933 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" event={"ID":"6ae87e51-bfce-41e2-b41a-327df982e7aa","Type":"ContainerDied","Data":"40f68fa2354de3ddc41781cf3181216cd74d2a28888e580f2819a08cf20cc97d"} Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.457069 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.457357 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.467443 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn9wz\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-kube-api-access-sn9wz\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.467647 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a1107f41-4b1d-4531-91cc-329f8ba26bea-lock\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.467763 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.467874 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.468092 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a1107f41-4b1d-4531-91cc-329f8ba26bea-cache\") pod 
\"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.473515 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8nm2t"] Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.493347 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.525910 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-cgtjg" podStartSLOduration=2.525884199 podStartE2EDuration="2.525884199s" podCreationTimestamp="2025-09-29 10:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:00:02.517787161 +0000 UTC m=+927.884017425" watchObservedRunningTime="2025-09-29 10:00:02.525884199 +0000 UTC m=+927.892114463" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.542209 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.573466 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6frhv\" (UniqueName: \"kubernetes.io/projected/33790736-04eb-4bb1-b17f-512e8815f939-kube-api-access-6frhv\") pod \"33790736-04eb-4bb1-b17f-512e8815f939\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.573668 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-ovsdbserver-sb\") pod \"33790736-04eb-4bb1-b17f-512e8815f939\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.573741 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-config\") pod \"33790736-04eb-4bb1-b17f-512e8815f939\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.573786 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-dns-svc\") pod \"33790736-04eb-4bb1-b17f-512e8815f939\" (UID: \"33790736-04eb-4bb1-b17f-512e8815f939\") " Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.574162 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn9wz\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-kube-api-access-sn9wz\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.574243 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a1107f41-4b1d-4531-91cc-329f8ba26bea-lock\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.574290 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.574361 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " 
pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.574481 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a1107f41-4b1d-4531-91cc-329f8ba26bea-cache\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.575927 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a1107f41-4b1d-4531-91cc-329f8ba26bea-cache\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.584757 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33790736-04eb-4bb1-b17f-512e8815f939" (UID: "33790736-04eb-4bb1-b17f-512e8815f939"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.585245 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a1107f41-4b1d-4531-91cc-329f8ba26bea-lock\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: E0929 10:00:02.585489 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:00:02 crc kubenswrapper[4922]: E0929 10:00:02.585507 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:00:02 crc kubenswrapper[4922]: E0929 10:00:02.585560 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift podName:a1107f41-4b1d-4531-91cc-329f8ba26bea nodeName:}" failed. No retries permitted until 2025-09-29 10:00:03.085540213 +0000 UTC m=+928.451770477 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift") pod "swift-storage-0" (UID: "a1107f41-4b1d-4531-91cc-329f8ba26bea") : configmap "swift-ring-files" not found Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.586958 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.588039 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-config" (OuterVolumeSpecName: "config") pod "33790736-04eb-4bb1-b17f-512e8815f939" (UID: "33790736-04eb-4bb1-b17f-512e8815f939"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.588956 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33790736-04eb-4bb1-b17f-512e8815f939" (UID: "33790736-04eb-4bb1-b17f-512e8815f939"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.602753 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33790736-04eb-4bb1-b17f-512e8815f939-kube-api-access-6frhv" (OuterVolumeSpecName: "kube-api-access-6frhv") pod "33790736-04eb-4bb1-b17f-512e8815f939" (UID: "33790736-04eb-4bb1-b17f-512e8815f939"). InnerVolumeSpecName "kube-api-access-6frhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.615213 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.615872 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn9wz\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-kube-api-access-sn9wz\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.675594 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.675627 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.675636 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33790736-04eb-4bb1-b17f-512e8815f939-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.675646 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6frhv\" (UniqueName: \"kubernetes.io/projected/33790736-04eb-4bb1-b17f-512e8815f939-kube-api-access-6frhv\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.792900 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 29 10:00:02 crc kubenswrapper[4922]: 
I0929 10:00:02.795443 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.812415 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.814616 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.814801 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.817562 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.817721 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rh8wc" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.882786 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b7e7da6-14cb-4046-b71d-8039326ca601-config\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.882952 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lblb\" (UniqueName: \"kubernetes.io/projected/6b7e7da6-14cb-4046-b71d-8039326ca601-kube-api-access-5lblb\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.883001 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b7e7da6-14cb-4046-b71d-8039326ca601-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.883029 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7e7da6-14cb-4046-b71d-8039326ca601-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.883063 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b7e7da6-14cb-4046-b71d-8039326ca601-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.883202 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b7e7da6-14cb-4046-b71d-8039326ca601-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.883266 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b7e7da6-14cb-4046-b71d-8039326ca601-scripts\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.883816 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.979662 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wkc24"] Sep 29 10:00:02 crc kubenswrapper[4922]: E0929 10:00:02.980175 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ffb184-927a-4812-93a5-343a35a86cbf" containerName="init" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.980200 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ffb184-927a-4812-93a5-343a35a86cbf" containerName="init" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.980382 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ffb184-927a-4812-93a5-343a35a86cbf" containerName="init" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.980998 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.983891 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.985660 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.986652 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87295\" (UniqueName: \"kubernetes.io/projected/95ffb184-927a-4812-93a5-343a35a86cbf-kube-api-access-87295\") pod \"95ffb184-927a-4812-93a5-343a35a86cbf\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.986938 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-ovsdbserver-sb\") pod \"95ffb184-927a-4812-93a5-343a35a86cbf\" (UID: 
\"95ffb184-927a-4812-93a5-343a35a86cbf\") " Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.987135 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-config\") pod \"95ffb184-927a-4812-93a5-343a35a86cbf\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.987341 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-dns-svc\") pod \"95ffb184-927a-4812-93a5-343a35a86cbf\" (UID: \"95ffb184-927a-4812-93a5-343a35a86cbf\") " Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.987707 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b7e7da6-14cb-4046-b71d-8039326ca601-scripts\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.987875 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lblb\" (UniqueName: \"kubernetes.io/projected/6b7e7da6-14cb-4046-b71d-8039326ca601-kube-api-access-5lblb\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.987983 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b7e7da6-14cb-4046-b71d-8039326ca601-config\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.988096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6b7e7da6-14cb-4046-b71d-8039326ca601-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.988220 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7e7da6-14cb-4046-b71d-8039326ca601-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.988337 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b7e7da6-14cb-4046-b71d-8039326ca601-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.988529 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b7e7da6-14cb-4046-b71d-8039326ca601-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.989384 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b7e7da6-14cb-4046-b71d-8039326ca601-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.994869 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b7e7da6-14cb-4046-b71d-8039326ca601-config\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 
10:00:02.995118 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.996002 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b7e7da6-14cb-4046-b71d-8039326ca601-scripts\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:02 crc kubenswrapper[4922]: I0929 10:00:02.998354 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7e7da6-14cb-4046-b71d-8039326ca601-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.002706 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ffb184-927a-4812-93a5-343a35a86cbf-kube-api-access-87295" (OuterVolumeSpecName: "kube-api-access-87295") pod "95ffb184-927a-4812-93a5-343a35a86cbf" (UID: "95ffb184-927a-4812-93a5-343a35a86cbf"). InnerVolumeSpecName "kube-api-access-87295". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.007521 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b7e7da6-14cb-4046-b71d-8039326ca601-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.008813 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b7e7da6-14cb-4046-b71d-8039326ca601-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.023793 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wkc24"] Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.026990 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lblb\" (UniqueName: \"kubernetes.io/projected/6b7e7da6-14cb-4046-b71d-8039326ca601-kube-api-access-5lblb\") pod \"ovn-northd-0\" (UID: \"6b7e7da6-14cb-4046-b71d-8039326ca601\") " pod="openstack/ovn-northd-0" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.042888 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-config" (OuterVolumeSpecName: "config") pod "95ffb184-927a-4812-93a5-343a35a86cbf" (UID: "95ffb184-927a-4812-93a5-343a35a86cbf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.042952 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "95ffb184-927a-4812-93a5-343a35a86cbf" (UID: "95ffb184-927a-4812-93a5-343a35a86cbf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: E0929 10:00:03.045138 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fjl4h ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fjl4h ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-wkc24" podUID="cb5207ca-27f9-4291-ada6-dee183994b0d" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.060962 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tvgqs"] Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.066685 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.067533 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95ffb184-927a-4812-93a5-343a35a86cbf" (UID: "95ffb184-927a-4812-93a5-343a35a86cbf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090081 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-dispersionconf\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090149 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-swiftconf\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090187 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090395 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjl4h\" (UniqueName: \"kubernetes.io/projected/cb5207ca-27f9-4291-ada6-dee183994b0d-kube-api-access-fjl4h\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: E0929 10:00:03.090447 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:00:03 crc kubenswrapper[4922]: E0929 10:00:03.090484 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" 
not found Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090462 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-scripts\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: E0929 10:00:03.090548 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift podName:a1107f41-4b1d-4531-91cc-329f8ba26bea nodeName:}" failed. No retries permitted until 2025-09-29 10:00:04.090526633 +0000 UTC m=+929.456756947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift") pod "swift-storage-0" (UID: "a1107f41-4b1d-4531-91cc-329f8ba26bea") : configmap "swift-ring-files" not found Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090580 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-ring-data-devices\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090650 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb5207ca-27f9-4291-ada6-dee183994b0d-etc-swift\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090677 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-combined-ca-bundle\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090737 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090752 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090763 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95ffb184-927a-4812-93a5-343a35a86cbf-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.090773 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87295\" (UniqueName: \"kubernetes.io/projected/95ffb184-927a-4812-93a5-343a35a86cbf-kube-api-access-87295\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.095971 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tvgqs"] Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.102935 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wkc24"] Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.182998 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.192698 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-combined-ca-bundle\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.193018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-dispersionconf\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.193146 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vjv\" (UniqueName: \"kubernetes.io/projected/396dcf64-c14b-4e56-9533-dbadbfac272a-kube-api-access-77vjv\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.193299 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-scripts\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.193403 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-swiftconf\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " 
pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.193503 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-dispersionconf\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.193603 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/396dcf64-c14b-4e56-9533-dbadbfac272a-etc-swift\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.193727 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl4h\" (UniqueName: \"kubernetes.io/projected/cb5207ca-27f9-4291-ada6-dee183994b0d-kube-api-access-fjl4h\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.193861 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-scripts\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.193967 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-ring-data-devices\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" 
Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.194119 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-combined-ca-bundle\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.194255 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-swiftconf\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.194357 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-ring-data-devices\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.194453 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb5207ca-27f9-4291-ada6-dee183994b0d-etc-swift\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.195169 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb5207ca-27f9-4291-ada6-dee183994b0d-etc-swift\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 
10:00:03.195504 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-scripts\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.196342 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-combined-ca-bundle\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.196514 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-ring-data-devices\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.199210 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-dispersionconf\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.201466 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-swiftconf\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.216136 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjl4h\" (UniqueName: 
\"kubernetes.io/projected/cb5207ca-27f9-4291-ada6-dee183994b0d-kube-api-access-fjl4h\") pod \"swift-ring-rebalance-wkc24\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.296398 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77vjv\" (UniqueName: \"kubernetes.io/projected/396dcf64-c14b-4e56-9533-dbadbfac272a-kube-api-access-77vjv\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.296984 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-scripts\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.297019 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-dispersionconf\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.297056 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/396dcf64-c14b-4e56-9533-dbadbfac272a-etc-swift\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.297163 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-combined-ca-bundle\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.297211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-swiftconf\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.297241 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-ring-data-devices\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.298198 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-scripts\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.298393 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-ring-data-devices\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.298568 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/396dcf64-c14b-4e56-9533-dbadbfac272a-etc-swift\") pod \"swift-ring-rebalance-tvgqs\" (UID: 
\"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.305599 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-swiftconf\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.306521 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-combined-ca-bundle\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.310690 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-dispersionconf\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.321785 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vjv\" (UniqueName: \"kubernetes.io/projected/396dcf64-c14b-4e56-9533-dbadbfac272a-kube-api-access-77vjv\") pod \"swift-ring-rebalance-tvgqs\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.392430 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.564951 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" event={"ID":"95ffb184-927a-4812-93a5-343a35a86cbf","Type":"ContainerDied","Data":"fcae2883642512f43979e233e8643d610d0b8c3e91aa17292d09cbc521a5b82c"} Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.565019 4922 scope.go:117] "RemoveContainer" containerID="803cc67e1ab3eb92e74b9c8b48b46928bf1c25318b5e312c9e778d53abcbcb72" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.565075 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tzngn" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.583048 4922 generic.go:334] "Generic (PLEG): container finished" podID="3b99b68d-2f67-466e-88af-d60bc5d9d283" containerID="1b8f692fdaee6b5ccf2bb9ec973d28b87c59fd55d9fb231d5002149484ede311" exitCode=0 Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.584330 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8nm2t" event={"ID":"3b99b68d-2f67-466e-88af-d60bc5d9d283","Type":"ContainerDied","Data":"1b8f692fdaee6b5ccf2bb9ec973d28b87c59fd55d9fb231d5002149484ede311"} Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.584362 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8nm2t" event={"ID":"3b99b68d-2f67-466e-88af-d60bc5d9d283","Type":"ContainerStarted","Data":"c27f3edc2af2c051666dbaa912c3f7d6789eff3c41cf4ee1a131bbd5185165c9"} Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.584850 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.584965 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-pk48l" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.635763 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.717167 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.755030 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tzngn"] Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.778058 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tzngn"] Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.801239 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-pk48l"] Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.808800 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-pk48l"] Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.823767 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-swiftconf\") pod \"cb5207ca-27f9-4291-ada6-dee183994b0d\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.823862 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-ring-data-devices\") pod \"cb5207ca-27f9-4291-ada6-dee183994b0d\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.823892 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-combined-ca-bundle\") pod \"cb5207ca-27f9-4291-ada6-dee183994b0d\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.823929 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjl4h\" (UniqueName: \"kubernetes.io/projected/cb5207ca-27f9-4291-ada6-dee183994b0d-kube-api-access-fjl4h\") pod \"cb5207ca-27f9-4291-ada6-dee183994b0d\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.823996 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-dispersionconf\") pod \"cb5207ca-27f9-4291-ada6-dee183994b0d\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.824039 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb5207ca-27f9-4291-ada6-dee183994b0d-etc-swift\") pod \"cb5207ca-27f9-4291-ada6-dee183994b0d\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.824073 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-scripts\") pod \"cb5207ca-27f9-4291-ada6-dee183994b0d\" (UID: \"cb5207ca-27f9-4291-ada6-dee183994b0d\") " Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.824797 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-scripts" (OuterVolumeSpecName: "scripts") pod "cb5207ca-27f9-4291-ada6-dee183994b0d" (UID: "cb5207ca-27f9-4291-ada6-dee183994b0d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.828583 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cb5207ca-27f9-4291-ada6-dee183994b0d" (UID: "cb5207ca-27f9-4291-ada6-dee183994b0d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.829176 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb5207ca-27f9-4291-ada6-dee183994b0d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cb5207ca-27f9-4291-ada6-dee183994b0d" (UID: "cb5207ca-27f9-4291-ada6-dee183994b0d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.832986 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cb5207ca-27f9-4291-ada6-dee183994b0d" (UID: "cb5207ca-27f9-4291-ada6-dee183994b0d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.833586 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5207ca-27f9-4291-ada6-dee183994b0d-kube-api-access-fjl4h" (OuterVolumeSpecName: "kube-api-access-fjl4h") pod "cb5207ca-27f9-4291-ada6-dee183994b0d" (UID: "cb5207ca-27f9-4291-ada6-dee183994b0d"). InnerVolumeSpecName "kube-api-access-fjl4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.834508 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb5207ca-27f9-4291-ada6-dee183994b0d" (UID: "cb5207ca-27f9-4291-ada6-dee183994b0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.834541 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cb5207ca-27f9-4291-ada6-dee183994b0d" (UID: "cb5207ca-27f9-4291-ada6-dee183994b0d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.926033 4922 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.927626 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.927646 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjl4h\" (UniqueName: \"kubernetes.io/projected/cb5207ca-27f9-4291-ada6-dee183994b0d-kube-api-access-fjl4h\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.927717 4922 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-dispersionconf\") on node 
\"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.927754 4922 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb5207ca-27f9-4291-ada6-dee183994b0d-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.927765 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb5207ca-27f9-4291-ada6-dee183994b0d-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.927774 4922 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb5207ca-27f9-4291-ada6-dee183994b0d-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:03 crc kubenswrapper[4922]: I0929 10:00:03.988950 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.132038 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hscn\" (UniqueName: \"kubernetes.io/projected/6ae87e51-bfce-41e2-b41a-327df982e7aa-kube-api-access-7hscn\") pod \"6ae87e51-bfce-41e2-b41a-327df982e7aa\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.132123 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ae87e51-bfce-41e2-b41a-327df982e7aa-config-volume\") pod \"6ae87e51-bfce-41e2-b41a-327df982e7aa\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.132432 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ae87e51-bfce-41e2-b41a-327df982e7aa-secret-volume\") pod 
\"6ae87e51-bfce-41e2-b41a-327df982e7aa\" (UID: \"6ae87e51-bfce-41e2-b41a-327df982e7aa\") " Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.132837 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:04 crc kubenswrapper[4922]: E0929 10:00:04.133104 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:00:04 crc kubenswrapper[4922]: E0929 10:00:04.133170 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.133120 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae87e51-bfce-41e2-b41a-327df982e7aa-config-volume" (OuterVolumeSpecName: "config-volume") pod "6ae87e51-bfce-41e2-b41a-327df982e7aa" (UID: "6ae87e51-bfce-41e2-b41a-327df982e7aa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:04 crc kubenswrapper[4922]: E0929 10:00:04.133252 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift podName:a1107f41-4b1d-4531-91cc-329f8ba26bea nodeName:}" failed. No retries permitted until 2025-09-29 10:00:06.133224927 +0000 UTC m=+931.499455211 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift") pod "swift-storage-0" (UID: "a1107f41-4b1d-4531-91cc-329f8ba26bea") : configmap "swift-ring-files" not found Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.137330 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae87e51-bfce-41e2-b41a-327df982e7aa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6ae87e51-bfce-41e2-b41a-327df982e7aa" (UID: "6ae87e51-bfce-41e2-b41a-327df982e7aa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.137970 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae87e51-bfce-41e2-b41a-327df982e7aa-kube-api-access-7hscn" (OuterVolumeSpecName: "kube-api-access-7hscn") pod "6ae87e51-bfce-41e2-b41a-327df982e7aa" (UID: "6ae87e51-bfce-41e2-b41a-327df982e7aa"). InnerVolumeSpecName "kube-api-access-7hscn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.173969 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tvgqs"] Sep 29 10:00:04 crc kubenswrapper[4922]: W0929 10:00:04.176736 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod396dcf64_c14b_4e56_9533_dbadbfac272a.slice/crio-d88b9cdcc941594ee86d24daf17dce8eb997701e4a9ad1d8fb0220e5d38c2224 WatchSource:0}: Error finding container d88b9cdcc941594ee86d24daf17dce8eb997701e4a9ad1d8fb0220e5d38c2224: Status 404 returned error can't find the container with id d88b9cdcc941594ee86d24daf17dce8eb997701e4a9ad1d8fb0220e5d38c2224 Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.237108 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hscn\" (UniqueName: \"kubernetes.io/projected/6ae87e51-bfce-41e2-b41a-327df982e7aa-kube-api-access-7hscn\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.237150 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ae87e51-bfce-41e2-b41a-327df982e7aa-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.237161 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ae87e51-bfce-41e2-b41a-327df982e7aa-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.594099 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6b7e7da6-14cb-4046-b71d-8039326ca601","Type":"ContainerStarted","Data":"5de590d52050088d2a0609b6a3c3af9138a23281a6d51d2ca3be50029f68c6e7"} Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.598417 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-698758b865-8nm2t" event={"ID":"3b99b68d-2f67-466e-88af-d60bc5d9d283","Type":"ContainerStarted","Data":"acd183b54cfcc96088f01e7ef8f8f908c8a88e94de3bb6fcb1744c4614944f03"} Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.598565 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.600823 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.600847 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9" event={"ID":"6ae87e51-bfce-41e2-b41a-327df982e7aa","Type":"ContainerDied","Data":"23f96b5b873149911adf0ab6363c927f3f2526349c89fe7cf346c8fb2b29a6cf"} Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.600892 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f96b5b873149911adf0ab6363c927f3f2526349c89fe7cf346c8fb2b29a6cf" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.603353 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tvgqs" event={"ID":"396dcf64-c14b-4e56-9533-dbadbfac272a","Type":"ContainerStarted","Data":"d88b9cdcc941594ee86d24daf17dce8eb997701e4a9ad1d8fb0220e5d38c2224"} Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.604394 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wkc24" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.626787 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-8nm2t" podStartSLOduration=3.626759027 podStartE2EDuration="3.626759027s" podCreationTimestamp="2025-09-29 10:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:00:04.620481987 +0000 UTC m=+929.986712261" watchObservedRunningTime="2025-09-29 10:00:04.626759027 +0000 UTC m=+929.992989291" Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.670980 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wkc24"] Sep 29 10:00:04 crc kubenswrapper[4922]: I0929 10:00:04.677201 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-wkc24"] Sep 29 10:00:05 crc kubenswrapper[4922]: I0929 10:00:05.480796 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33790736-04eb-4bb1-b17f-512e8815f939" path="/var/lib/kubelet/pods/33790736-04eb-4bb1-b17f-512e8815f939/volumes" Sep 29 10:00:05 crc kubenswrapper[4922]: I0929 10:00:05.486556 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ffb184-927a-4812-93a5-343a35a86cbf" path="/var/lib/kubelet/pods/95ffb184-927a-4812-93a5-343a35a86cbf/volumes" Sep 29 10:00:05 crc kubenswrapper[4922]: I0929 10:00:05.487969 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb5207ca-27f9-4291-ada6-dee183994b0d" path="/var/lib/kubelet/pods/cb5207ca-27f9-4291-ada6-dee183994b0d/volumes" Sep 29 10:00:06 crc kubenswrapper[4922]: I0929 10:00:06.176686 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift\") pod \"swift-storage-0\" (UID: 
\"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:06 crc kubenswrapper[4922]: E0929 10:00:06.176889 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:00:06 crc kubenswrapper[4922]: E0929 10:00:06.176939 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:00:06 crc kubenswrapper[4922]: E0929 10:00:06.177006 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift podName:a1107f41-4b1d-4531-91cc-329f8ba26bea nodeName:}" failed. No retries permitted until 2025-09-29 10:00:10.176984309 +0000 UTC m=+935.543214573 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift") pod "swift-storage-0" (UID: "a1107f41-4b1d-4531-91cc-329f8ba26bea") : configmap "swift-ring-files" not found Sep 29 10:00:06 crc kubenswrapper[4922]: I0929 10:00:06.626860 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6b7e7da6-14cb-4046-b71d-8039326ca601","Type":"ContainerStarted","Data":"5e5852b8f504fad6ab1442a49083d5619559065b0399202ebd5cea961afee5cd"} Sep 29 10:00:07 crc kubenswrapper[4922]: I0929 10:00:07.720591 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 29 10:00:07 crc kubenswrapper[4922]: I0929 10:00:07.721186 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 29 10:00:07 crc kubenswrapper[4922]: I0929 10:00:07.790955 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 29 10:00:08 crc kubenswrapper[4922]: I0929 10:00:08.646434 4922 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6b7e7da6-14cb-4046-b71d-8039326ca601","Type":"ContainerStarted","Data":"d6a618f0d39633cb9bc0093ff440e2e4051d95f7bbfed7eeb0007ef04a485459"} Sep 29 10:00:08 crc kubenswrapper[4922]: I0929 10:00:08.647718 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 29 10:00:08 crc kubenswrapper[4922]: I0929 10:00:08.681503 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.845386549 podStartE2EDuration="6.681480563s" podCreationTimestamp="2025-09-29 10:00:02 +0000 UTC" firstStartedPulling="2025-09-29 10:00:03.664394196 +0000 UTC m=+929.030624460" lastFinishedPulling="2025-09-29 10:00:05.50048821 +0000 UTC m=+930.866718474" observedRunningTime="2025-09-29 10:00:08.668102361 +0000 UTC m=+934.034332635" watchObservedRunningTime="2025-09-29 10:00:08.681480563 +0000 UTC m=+934.047710837" Sep 29 10:00:08 crc kubenswrapper[4922]: I0929 10:00:08.716198 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.019896 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.019952 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.025808 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-gjlts"] Sep 29 10:00:09 crc kubenswrapper[4922]: E0929 10:00:09.026374 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae87e51-bfce-41e2-b41a-327df982e7aa" containerName="collect-profiles" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.026443 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6ae87e51-bfce-41e2-b41a-327df982e7aa" containerName="collect-profiles" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.026675 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae87e51-bfce-41e2-b41a-327df982e7aa" containerName="collect-profiles" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.032773 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gjlts" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.064899 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gjlts"] Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.151506 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xklv\" (UniqueName: \"kubernetes.io/projected/ff184c2c-608a-4711-a2fe-5c8ffe2d64ac-kube-api-access-5xklv\") pod \"placement-db-create-gjlts\" (UID: \"ff184c2c-608a-4711-a2fe-5c8ffe2d64ac\") " pod="openstack/placement-db-create-gjlts" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.213990 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.253076 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xklv\" (UniqueName: \"kubernetes.io/projected/ff184c2c-608a-4711-a2fe-5c8ffe2d64ac-kube-api-access-5xklv\") pod \"placement-db-create-gjlts\" (UID: \"ff184c2c-608a-4711-a2fe-5c8ffe2d64ac\") " pod="openstack/placement-db-create-gjlts" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.274605 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xklv\" (UniqueName: \"kubernetes.io/projected/ff184c2c-608a-4711-a2fe-5c8ffe2d64ac-kube-api-access-5xklv\") pod \"placement-db-create-gjlts\" (UID: \"ff184c2c-608a-4711-a2fe-5c8ffe2d64ac\") " 
pod="openstack/placement-db-create-gjlts" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.314434 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lsq42"] Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.316203 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lsq42" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.331003 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lsq42"] Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.377086 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gjlts" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.456736 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j4jf\" (UniqueName: \"kubernetes.io/projected/18063e86-56aa-470c-a41d-d7965d242a20-kube-api-access-2j4jf\") pod \"glance-db-create-lsq42\" (UID: \"18063e86-56aa-470c-a41d-d7965d242a20\") " pod="openstack/glance-db-create-lsq42" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.558393 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j4jf\" (UniqueName: \"kubernetes.io/projected/18063e86-56aa-470c-a41d-d7965d242a20-kube-api-access-2j4jf\") pod \"glance-db-create-lsq42\" (UID: \"18063e86-56aa-470c-a41d-d7965d242a20\") " pod="openstack/glance-db-create-lsq42" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.593820 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j4jf\" (UniqueName: \"kubernetes.io/projected/18063e86-56aa-470c-a41d-d7965d242a20-kube-api-access-2j4jf\") pod \"glance-db-create-lsq42\" (UID: \"18063e86-56aa-470c-a41d-d7965d242a20\") " pod="openstack/glance-db-create-lsq42" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.636955 4922 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-db-create-lsq42" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.668602 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tvgqs" event={"ID":"396dcf64-c14b-4e56-9533-dbadbfac272a","Type":"ContainerStarted","Data":"b6f68e91ee9ed5dcc34b3a4684c8f2d06a8b2f442803e364dfdf4b47012dd030"} Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.696961 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-tvgqs" podStartSLOduration=2.363080492 podStartE2EDuration="6.69694198s" podCreationTimestamp="2025-09-29 10:00:03 +0000 UTC" firstStartedPulling="2025-09-29 10:00:04.179145339 +0000 UTC m=+929.545375613" lastFinishedPulling="2025-09-29 10:00:08.513006837 +0000 UTC m=+933.879237101" observedRunningTime="2025-09-29 10:00:09.68989572 +0000 UTC m=+935.056125994" watchObservedRunningTime="2025-09-29 10:00:09.69694198 +0000 UTC m=+935.063172244" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.742105 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.869081 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gjlts"] Sep 29 10:00:09 crc kubenswrapper[4922]: W0929 10:00:09.880904 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff184c2c_608a_4711_a2fe_5c8ffe2d64ac.slice/crio-8206cac9cbdda150e60b4a13f677f77242f3aa2da41403e1dd0246c4d65a82b2 WatchSource:0}: Error finding container 8206cac9cbdda150e60b4a13f677f77242f3aa2da41403e1dd0246c4d65a82b2: Status 404 returned error can't find the container with id 8206cac9cbdda150e60b4a13f677f77242f3aa2da41403e1dd0246c4d65a82b2 Sep 29 10:00:09 crc kubenswrapper[4922]: I0929 10:00:09.941095 4922 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/glance-db-create-lsq42"] Sep 29 10:00:10 crc kubenswrapper[4922]: I0929 10:00:10.270794 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:10 crc kubenswrapper[4922]: E0929 10:00:10.271139 4922 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 10:00:10 crc kubenswrapper[4922]: E0929 10:00:10.271231 4922 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 10:00:10 crc kubenswrapper[4922]: E0929 10:00:10.271296 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift podName:a1107f41-4b1d-4531-91cc-329f8ba26bea nodeName:}" failed. No retries permitted until 2025-09-29 10:00:18.271278256 +0000 UTC m=+943.637508510 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift") pod "swift-storage-0" (UID: "a1107f41-4b1d-4531-91cc-329f8ba26bea") : configmap "swift-ring-files" not found Sep 29 10:00:10 crc kubenswrapper[4922]: I0929 10:00:10.680671 4922 generic.go:334] "Generic (PLEG): container finished" podID="18063e86-56aa-470c-a41d-d7965d242a20" containerID="f94cf50a1f1bd578e71cb01ddce02d66dfb7793da43fa42ee39478637362a8a0" exitCode=0 Sep 29 10:00:10 crc kubenswrapper[4922]: I0929 10:00:10.680747 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lsq42" event={"ID":"18063e86-56aa-470c-a41d-d7965d242a20","Type":"ContainerDied","Data":"f94cf50a1f1bd578e71cb01ddce02d66dfb7793da43fa42ee39478637362a8a0"} Sep 29 10:00:10 crc kubenswrapper[4922]: I0929 10:00:10.681039 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lsq42" event={"ID":"18063e86-56aa-470c-a41d-d7965d242a20","Type":"ContainerStarted","Data":"a398baf6133db1f2a10df49356dff57569e22c7322b876e97f694b4d47dbbcef"} Sep 29 10:00:10 crc kubenswrapper[4922]: I0929 10:00:10.683341 4922 generic.go:334] "Generic (PLEG): container finished" podID="ff184c2c-608a-4711-a2fe-5c8ffe2d64ac" containerID="00a5e1c9248f76ae152bbd86fedb3abea340b22063905e555de49e63678f20ae" exitCode=0 Sep 29 10:00:10 crc kubenswrapper[4922]: I0929 10:00:10.683459 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gjlts" event={"ID":"ff184c2c-608a-4711-a2fe-5c8ffe2d64ac","Type":"ContainerDied","Data":"00a5e1c9248f76ae152bbd86fedb3abea340b22063905e555de49e63678f20ae"} Sep 29 10:00:10 crc kubenswrapper[4922]: I0929 10:00:10.683480 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gjlts" event={"ID":"ff184c2c-608a-4711-a2fe-5c8ffe2d64ac","Type":"ContainerStarted","Data":"8206cac9cbdda150e60b4a13f677f77242f3aa2da41403e1dd0246c4d65a82b2"} 
Sep 29 10:00:11 crc kubenswrapper[4922]: I0929 10:00:11.927217 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:11 crc kubenswrapper[4922]: I0929 10:00:11.995060 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vndlk"] Sep 29 10:00:11 crc kubenswrapper[4922]: I0929 10:00:11.995626 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" podUID="2026402b-4401-489a-9f34-264a57ec2501" containerName="dnsmasq-dns" containerID="cri-o://268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4" gracePeriod=10 Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.158587 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gjlts" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.164113 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lsq42" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.219097 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j4jf\" (UniqueName: \"kubernetes.io/projected/18063e86-56aa-470c-a41d-d7965d242a20-kube-api-access-2j4jf\") pod \"18063e86-56aa-470c-a41d-d7965d242a20\" (UID: \"18063e86-56aa-470c-a41d-d7965d242a20\") " Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.219287 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xklv\" (UniqueName: \"kubernetes.io/projected/ff184c2c-608a-4711-a2fe-5c8ffe2d64ac-kube-api-access-5xklv\") pod \"ff184c2c-608a-4711-a2fe-5c8ffe2d64ac\" (UID: \"ff184c2c-608a-4711-a2fe-5c8ffe2d64ac\") " Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.227892 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff184c2c-608a-4711-a2fe-5c8ffe2d64ac-kube-api-access-5xklv" (OuterVolumeSpecName: "kube-api-access-5xklv") pod "ff184c2c-608a-4711-a2fe-5c8ffe2d64ac" (UID: "ff184c2c-608a-4711-a2fe-5c8ffe2d64ac"). InnerVolumeSpecName "kube-api-access-5xklv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.231254 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18063e86-56aa-470c-a41d-d7965d242a20-kube-api-access-2j4jf" (OuterVolumeSpecName: "kube-api-access-2j4jf") pod "18063e86-56aa-470c-a41d-d7965d242a20" (UID: "18063e86-56aa-470c-a41d-d7965d242a20"). InnerVolumeSpecName "kube-api-access-2j4jf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.322674 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xklv\" (UniqueName: \"kubernetes.io/projected/ff184c2c-608a-4711-a2fe-5c8ffe2d64ac-kube-api-access-5xklv\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.322715 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j4jf\" (UniqueName: \"kubernetes.io/projected/18063e86-56aa-470c-a41d-d7965d242a20-kube-api-access-2j4jf\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.467745 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.526189 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-config\") pod \"2026402b-4401-489a-9f34-264a57ec2501\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.526774 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpk5g\" (UniqueName: \"kubernetes.io/projected/2026402b-4401-489a-9f34-264a57ec2501-kube-api-access-xpk5g\") pod \"2026402b-4401-489a-9f34-264a57ec2501\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.526854 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-dns-svc\") pod \"2026402b-4401-489a-9f34-264a57ec2501\" (UID: \"2026402b-4401-489a-9f34-264a57ec2501\") " Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.543451 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2026402b-4401-489a-9f34-264a57ec2501-kube-api-access-xpk5g" (OuterVolumeSpecName: "kube-api-access-xpk5g") pod "2026402b-4401-489a-9f34-264a57ec2501" (UID: "2026402b-4401-489a-9f34-264a57ec2501"). InnerVolumeSpecName "kube-api-access-xpk5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.573727 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2026402b-4401-489a-9f34-264a57ec2501" (UID: "2026402b-4401-489a-9f34-264a57ec2501"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.587210 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-config" (OuterVolumeSpecName: "config") pod "2026402b-4401-489a-9f34-264a57ec2501" (UID: "2026402b-4401-489a-9f34-264a57ec2501"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.629163 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.629203 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpk5g\" (UniqueName: \"kubernetes.io/projected/2026402b-4401-489a-9f34-264a57ec2501-kube-api-access-xpk5g\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.629216 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2026402b-4401-489a-9f34-264a57ec2501-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.701963 4922 generic.go:334] "Generic (PLEG): container finished" podID="2026402b-4401-489a-9f34-264a57ec2501" containerID="268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4" exitCode=0 Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.702035 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.702062 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" event={"ID":"2026402b-4401-489a-9f34-264a57ec2501","Type":"ContainerDied","Data":"268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4"} Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.702136 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vndlk" event={"ID":"2026402b-4401-489a-9f34-264a57ec2501","Type":"ContainerDied","Data":"0527ffdf9fbc6dc23ff4f6f3b835a5cdd6860e8215b09792874e13e1007c00ee"} Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.702185 4922 scope.go:117] "RemoveContainer" containerID="268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.703892 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lsq42" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.703952 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lsq42" event={"ID":"18063e86-56aa-470c-a41d-d7965d242a20","Type":"ContainerDied","Data":"a398baf6133db1f2a10df49356dff57569e22c7322b876e97f694b4d47dbbcef"} Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.703983 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a398baf6133db1f2a10df49356dff57569e22c7322b876e97f694b4d47dbbcef" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.713336 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gjlts" event={"ID":"ff184c2c-608a-4711-a2fe-5c8ffe2d64ac","Type":"ContainerDied","Data":"8206cac9cbdda150e60b4a13f677f77242f3aa2da41403e1dd0246c4d65a82b2"} Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.713391 4922 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="8206cac9cbdda150e60b4a13f677f77242f3aa2da41403e1dd0246c4d65a82b2" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.713504 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gjlts" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.727234 4922 scope.go:117] "RemoveContainer" containerID="e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.757141 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vndlk"] Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.761773 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vndlk"] Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.769343 4922 scope.go:117] "RemoveContainer" containerID="268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4" Sep 29 10:00:12 crc kubenswrapper[4922]: E0929 10:00:12.773351 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4\": container with ID starting with 268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4 not found: ID does not exist" containerID="268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.773417 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4"} err="failed to get container status \"268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4\": rpc error: code = NotFound desc = could not find container \"268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4\": container with ID starting with 
268c5b563880289c2960574ae37992c8e7abf2466157e9f11a631597ab68adf4 not found: ID does not exist" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.773463 4922 scope.go:117] "RemoveContainer" containerID="e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807" Sep 29 10:00:12 crc kubenswrapper[4922]: E0929 10:00:12.773844 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807\": container with ID starting with e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807 not found: ID does not exist" containerID="e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807" Sep 29 10:00:12 crc kubenswrapper[4922]: I0929 10:00:12.773878 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807"} err="failed to get container status \"e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807\": rpc error: code = NotFound desc = could not find container \"e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807\": container with ID starting with e1676b4eefb9a2defbb5052e79e3c270afc0d0ce77cbb453dfb8515cad5dc807 not found: ID does not exist" Sep 29 10:00:13 crc kubenswrapper[4922]: I0929 10:00:13.476698 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2026402b-4401-489a-9f34-264a57ec2501" path="/var/lib/kubelet/pods/2026402b-4401-489a-9f34-264a57ec2501/volumes" Sep 29 10:00:16 crc kubenswrapper[4922]: I0929 10:00:16.757526 4922 generic.go:334] "Generic (PLEG): container finished" podID="396dcf64-c14b-4e56-9533-dbadbfac272a" containerID="b6f68e91ee9ed5dcc34b3a4684c8f2d06a8b2f442803e364dfdf4b47012dd030" exitCode=0 Sep 29 10:00:16 crc kubenswrapper[4922]: I0929 10:00:16.757631 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tvgqs" 
event={"ID":"396dcf64-c14b-4e56-9533-dbadbfac272a","Type":"ContainerDied","Data":"b6f68e91ee9ed5dcc34b3a4684c8f2d06a8b2f442803e364dfdf4b47012dd030"} Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.119514 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.247461 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-swiftconf\") pod \"396dcf64-c14b-4e56-9533-dbadbfac272a\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.247602 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-combined-ca-bundle\") pod \"396dcf64-c14b-4e56-9533-dbadbfac272a\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.247718 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-dispersionconf\") pod \"396dcf64-c14b-4e56-9533-dbadbfac272a\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.247768 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/396dcf64-c14b-4e56-9533-dbadbfac272a-etc-swift\") pod \"396dcf64-c14b-4e56-9533-dbadbfac272a\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.247822 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-ring-data-devices\") 
pod \"396dcf64-c14b-4e56-9533-dbadbfac272a\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.247987 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-scripts\") pod \"396dcf64-c14b-4e56-9533-dbadbfac272a\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.248031 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77vjv\" (UniqueName: \"kubernetes.io/projected/396dcf64-c14b-4e56-9533-dbadbfac272a-kube-api-access-77vjv\") pod \"396dcf64-c14b-4e56-9533-dbadbfac272a\" (UID: \"396dcf64-c14b-4e56-9533-dbadbfac272a\") " Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.249446 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "396dcf64-c14b-4e56-9533-dbadbfac272a" (UID: "396dcf64-c14b-4e56-9533-dbadbfac272a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.249900 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396dcf64-c14b-4e56-9533-dbadbfac272a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "396dcf64-c14b-4e56-9533-dbadbfac272a" (UID: "396dcf64-c14b-4e56-9533-dbadbfac272a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.255411 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396dcf64-c14b-4e56-9533-dbadbfac272a-kube-api-access-77vjv" (OuterVolumeSpecName: "kube-api-access-77vjv") pod "396dcf64-c14b-4e56-9533-dbadbfac272a" (UID: "396dcf64-c14b-4e56-9533-dbadbfac272a"). InnerVolumeSpecName "kube-api-access-77vjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.260625 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "396dcf64-c14b-4e56-9533-dbadbfac272a" (UID: "396dcf64-c14b-4e56-9533-dbadbfac272a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.272661 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.279923 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "396dcf64-c14b-4e56-9533-dbadbfac272a" (UID: "396dcf64-c14b-4e56-9533-dbadbfac272a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.282663 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "396dcf64-c14b-4e56-9533-dbadbfac272a" (UID: "396dcf64-c14b-4e56-9533-dbadbfac272a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.287039 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-scripts" (OuterVolumeSpecName: "scripts") pod "396dcf64-c14b-4e56-9533-dbadbfac272a" (UID: "396dcf64-c14b-4e56-9533-dbadbfac272a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.350660 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.350845 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.350863 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77vjv\" (UniqueName: \"kubernetes.io/projected/396dcf64-c14b-4e56-9533-dbadbfac272a-kube-api-access-77vjv\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.350877 4922 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.350890 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.350904 4922 reconciler_common.go:293] "Volume 
detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/396dcf64-c14b-4e56-9533-dbadbfac272a-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.350914 4922 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/396dcf64-c14b-4e56-9533-dbadbfac272a-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.350926 4922 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/396dcf64-c14b-4e56-9533-dbadbfac272a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.391244 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1107f41-4b1d-4531-91cc-329f8ba26bea-etc-swift\") pod \"swift-storage-0\" (UID: \"a1107f41-4b1d-4531-91cc-329f8ba26bea\") " pod="openstack/swift-storage-0" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.623925 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-k5qmh"] Sep 29 10:00:18 crc kubenswrapper[4922]: E0929 10:00:18.624806 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18063e86-56aa-470c-a41d-d7965d242a20" containerName="mariadb-database-create" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.624848 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="18063e86-56aa-470c-a41d-d7965d242a20" containerName="mariadb-database-create" Sep 29 10:00:18 crc kubenswrapper[4922]: E0929 10:00:18.624859 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2026402b-4401-489a-9f34-264a57ec2501" containerName="dnsmasq-dns" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.624867 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2026402b-4401-489a-9f34-264a57ec2501" containerName="dnsmasq-dns" 
Sep 29 10:00:18 crc kubenswrapper[4922]: E0929 10:00:18.624898 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff184c2c-608a-4711-a2fe-5c8ffe2d64ac" containerName="mariadb-database-create" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.624906 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff184c2c-608a-4711-a2fe-5c8ffe2d64ac" containerName="mariadb-database-create" Sep 29 10:00:18 crc kubenswrapper[4922]: E0929 10:00:18.624920 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2026402b-4401-489a-9f34-264a57ec2501" containerName="init" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.624926 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2026402b-4401-489a-9f34-264a57ec2501" containerName="init" Sep 29 10:00:18 crc kubenswrapper[4922]: E0929 10:00:18.624946 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396dcf64-c14b-4e56-9533-dbadbfac272a" containerName="swift-ring-rebalance" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.624954 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="396dcf64-c14b-4e56-9533-dbadbfac272a" containerName="swift-ring-rebalance" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.625179 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="18063e86-56aa-470c-a41d-d7965d242a20" containerName="mariadb-database-create" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.625198 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2026402b-4401-489a-9f34-264a57ec2501" containerName="dnsmasq-dns" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.625212 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="396dcf64-c14b-4e56-9533-dbadbfac272a" containerName="swift-ring-rebalance" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.625223 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff184c2c-608a-4711-a2fe-5c8ffe2d64ac" 
containerName="mariadb-database-create" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.626017 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-k5qmh" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.631205 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-k5qmh"] Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.654453 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.757663 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lt97\" (UniqueName: \"kubernetes.io/projected/8f79e4ba-2854-4d57-9d76-391f020c62ce-kube-api-access-6lt97\") pod \"keystone-db-create-k5qmh\" (UID: \"8f79e4ba-2854-4d57-9d76-391f020c62ce\") " pod="openstack/keystone-db-create-k5qmh" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.807326 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tvgqs" event={"ID":"396dcf64-c14b-4e56-9533-dbadbfac272a","Type":"ContainerDied","Data":"d88b9cdcc941594ee86d24daf17dce8eb997701e4a9ad1d8fb0220e5d38c2224"} Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.807408 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d88b9cdcc941594ee86d24daf17dce8eb997701e4a9ad1d8fb0220e5d38c2224" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.807590 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tvgqs" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.860052 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lt97\" (UniqueName: \"kubernetes.io/projected/8f79e4ba-2854-4d57-9d76-391f020c62ce-kube-api-access-6lt97\") pod \"keystone-db-create-k5qmh\" (UID: \"8f79e4ba-2854-4d57-9d76-391f020c62ce\") " pod="openstack/keystone-db-create-k5qmh" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.882797 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lt97\" (UniqueName: \"kubernetes.io/projected/8f79e4ba-2854-4d57-9d76-391f020c62ce-kube-api-access-6lt97\") pod \"keystone-db-create-k5qmh\" (UID: \"8f79e4ba-2854-4d57-9d76-391f020c62ce\") " pod="openstack/keystone-db-create-k5qmh" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.955456 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-k5qmh" Sep 29 10:00:18 crc kubenswrapper[4922]: E0929 10:00:18.966117 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod396dcf64_c14b_4e56_9533_dbadbfac272a.slice\": RecentStats: unable to find data in memory cache]" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.987694 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b8ce-account-create-5lq5r"] Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.989682 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b8ce-account-create-5lq5r" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.993818 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 29 10:00:18 crc kubenswrapper[4922]: I0929 10:00:18.998611 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b8ce-account-create-5lq5r"] Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.067681 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lspff\" (UniqueName: \"kubernetes.io/projected/9b58c075-8b99-4903-8e23-973f208b4edd-kube-api-access-lspff\") pod \"placement-b8ce-account-create-5lq5r\" (UID: \"9b58c075-8b99-4903-8e23-973f208b4edd\") " pod="openstack/placement-b8ce-account-create-5lq5r" Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.170762 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lspff\" (UniqueName: \"kubernetes.io/projected/9b58c075-8b99-4903-8e23-973f208b4edd-kube-api-access-lspff\") pod \"placement-b8ce-account-create-5lq5r\" (UID: \"9b58c075-8b99-4903-8e23-973f208b4edd\") " pod="openstack/placement-b8ce-account-create-5lq5r" Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.203358 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lspff\" (UniqueName: \"kubernetes.io/projected/9b58c075-8b99-4903-8e23-973f208b4edd-kube-api-access-lspff\") pod \"placement-b8ce-account-create-5lq5r\" (UID: \"9b58c075-8b99-4903-8e23-973f208b4edd\") " pod="openstack/placement-b8ce-account-create-5lq5r" Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.269447 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 29 10:00:19 crc kubenswrapper[4922]: W0929 10:00:19.275219 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1107f41_4b1d_4531_91cc_329f8ba26bea.slice/crio-64bac12ef49d56a522ab644e71e848dd1f4ae8390c881eca36d590e5bc53b668 WatchSource:0}: Error finding container 64bac12ef49d56a522ab644e71e848dd1f4ae8390c881eca36d590e5bc53b668: Status 404 returned error can't find the container with id 64bac12ef49d56a522ab644e71e848dd1f4ae8390c881eca36d590e5bc53b668 Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.313538 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b8ce-account-create-5lq5r" Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.438995 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-980b-account-create-z5h6z"] Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.440446 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-980b-account-create-z5h6z" Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.444749 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.466158 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-980b-account-create-z5h6z"] Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.477110 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbllm\" (UniqueName: \"kubernetes.io/projected/62832609-e9cd-41a5-a77e-4fdbe35cd12e-kube-api-access-gbllm\") pod \"glance-980b-account-create-z5h6z\" (UID: \"62832609-e9cd-41a5-a77e-4fdbe35cd12e\") " pod="openstack/glance-980b-account-create-z5h6z" Sep 29 10:00:19 crc kubenswrapper[4922]: W0929 10:00:19.481105 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f79e4ba_2854_4d57_9d76_391f020c62ce.slice/crio-337a1f265dcc0a73fba651171161875c0e1abfea4a2fcaeedbb61372db795ba2 WatchSource:0}: Error finding container 337a1f265dcc0a73fba651171161875c0e1abfea4a2fcaeedbb61372db795ba2: Status 404 returned error can't find the container with id 337a1f265dcc0a73fba651171161875c0e1abfea4a2fcaeedbb61372db795ba2 Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.483504 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-k5qmh"] Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.579448 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbllm\" (UniqueName: \"kubernetes.io/projected/62832609-e9cd-41a5-a77e-4fdbe35cd12e-kube-api-access-gbllm\") pod \"glance-980b-account-create-z5h6z\" (UID: \"62832609-e9cd-41a5-a77e-4fdbe35cd12e\") " pod="openstack/glance-980b-account-create-z5h6z" Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.581598 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b8ce-account-create-5lq5r"] Sep 29 10:00:19 crc kubenswrapper[4922]: W0929 10:00:19.599005 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b58c075_8b99_4903_8e23_973f208b4edd.slice/crio-e22a27b100d84fb9414b77d056122197b99672eaec0b215b2cbc7cf0e8a45d3c WatchSource:0}: Error finding container e22a27b100d84fb9414b77d056122197b99672eaec0b215b2cbc7cf0e8a45d3c: Status 404 returned error can't find the container with id e22a27b100d84fb9414b77d056122197b99672eaec0b215b2cbc7cf0e8a45d3c Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.603652 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbllm\" (UniqueName: \"kubernetes.io/projected/62832609-e9cd-41a5-a77e-4fdbe35cd12e-kube-api-access-gbllm\") pod \"glance-980b-account-create-z5h6z\" 
(UID: \"62832609-e9cd-41a5-a77e-4fdbe35cd12e\") " pod="openstack/glance-980b-account-create-z5h6z" Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.767501 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-980b-account-create-z5h6z" Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.824400 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k5qmh" event={"ID":"8f79e4ba-2854-4d57-9d76-391f020c62ce","Type":"ContainerStarted","Data":"337a1f265dcc0a73fba651171161875c0e1abfea4a2fcaeedbb61372db795ba2"} Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.828991 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"64bac12ef49d56a522ab644e71e848dd1f4ae8390c881eca36d590e5bc53b668"} Sep 29 10:00:19 crc kubenswrapper[4922]: I0929 10:00:19.832955 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b8ce-account-create-5lq5r" event={"ID":"9b58c075-8b99-4903-8e23-973f208b4edd","Type":"ContainerStarted","Data":"e22a27b100d84fb9414b77d056122197b99672eaec0b215b2cbc7cf0e8a45d3c"} Sep 29 10:00:20 crc kubenswrapper[4922]: I0929 10:00:20.244745 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-980b-account-create-z5h6z"] Sep 29 10:00:20 crc kubenswrapper[4922]: I0929 10:00:20.847674 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-980b-account-create-z5h6z" event={"ID":"62832609-e9cd-41a5-a77e-4fdbe35cd12e","Type":"ContainerStarted","Data":"cbcd94ee95672ace316804aa608c9e0623b22b692c139595bd3e8616001e61ae"} Sep 29 10:00:21 crc kubenswrapper[4922]: I0929 10:00:21.863914 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k5qmh" 
event={"ID":"8f79e4ba-2854-4d57-9d76-391f020c62ce","Type":"ContainerStarted","Data":"7b845b20c851b4631ce7f0d840910c73eab887da9fb50ef8a3ae28bf683bb115"} Sep 29 10:00:22 crc kubenswrapper[4922]: I0929 10:00:22.877272 4922 generic.go:334] "Generic (PLEG): container finished" podID="8f79e4ba-2854-4d57-9d76-391f020c62ce" containerID="7b845b20c851b4631ce7f0d840910c73eab887da9fb50ef8a3ae28bf683bb115" exitCode=0 Sep 29 10:00:22 crc kubenswrapper[4922]: I0929 10:00:22.877406 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k5qmh" event={"ID":"8f79e4ba-2854-4d57-9d76-391f020c62ce","Type":"ContainerDied","Data":"7b845b20c851b4631ce7f0d840910c73eab887da9fb50ef8a3ae28bf683bb115"} Sep 29 10:00:22 crc kubenswrapper[4922]: I0929 10:00:22.882560 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b58c075-8b99-4903-8e23-973f208b4edd" containerID="478fac433f081b6cf14a95627c1dc9eadcf24b141f8c90e7d09ca36e07bdd72c" exitCode=0 Sep 29 10:00:22 crc kubenswrapper[4922]: I0929 10:00:22.882640 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b8ce-account-create-5lq5r" event={"ID":"9b58c075-8b99-4903-8e23-973f208b4edd","Type":"ContainerDied","Data":"478fac433f081b6cf14a95627c1dc9eadcf24b141f8c90e7d09ca36e07bdd72c"} Sep 29 10:00:22 crc kubenswrapper[4922]: I0929 10:00:22.885174 4922 generic.go:334] "Generic (PLEG): container finished" podID="62832609-e9cd-41a5-a77e-4fdbe35cd12e" containerID="b612715a436f5665726ee3a58544af6b5dc218ef8f48ad388d5331f9138ef0ad" exitCode=0 Sep 29 10:00:22 crc kubenswrapper[4922]: I0929 10:00:22.885221 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-980b-account-create-z5h6z" event={"ID":"62832609-e9cd-41a5-a77e-4fdbe35cd12e","Type":"ContainerDied","Data":"b612715a436f5665726ee3a58544af6b5dc218ef8f48ad388d5331f9138ef0ad"} Sep 29 10:00:23 crc kubenswrapper[4922]: I0929 10:00:23.901985 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"852a4e4bb7b811ef046218f2283f597dc3bf7ff96c91c27d5c3563e0c722c28a"} Sep 29 10:00:23 crc kubenswrapper[4922]: I0929 10:00:23.902095 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"513e9e6ee2ecbb8fd850c4b8b23d121856376d1465339055b992acfb45702878"} Sep 29 10:00:23 crc kubenswrapper[4922]: I0929 10:00:23.902131 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"ceabcde9036da5e663ab845139d0f096f9fe1ee2c3f6060d67b9807056a47cb0"} Sep 29 10:00:23 crc kubenswrapper[4922]: I0929 10:00:23.902157 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"28523a0fde0581edaf2ba294d9bed14b37dbfa5e57b9f9e378b971805bb47ea8"} Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.313593 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-980b-account-create-z5h6z" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.410209 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbllm\" (UniqueName: \"kubernetes.io/projected/62832609-e9cd-41a5-a77e-4fdbe35cd12e-kube-api-access-gbllm\") pod \"62832609-e9cd-41a5-a77e-4fdbe35cd12e\" (UID: \"62832609-e9cd-41a5-a77e-4fdbe35cd12e\") " Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.416730 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62832609-e9cd-41a5-a77e-4fdbe35cd12e-kube-api-access-gbllm" (OuterVolumeSpecName: "kube-api-access-gbllm") pod "62832609-e9cd-41a5-a77e-4fdbe35cd12e" (UID: "62832609-e9cd-41a5-a77e-4fdbe35cd12e"). InnerVolumeSpecName "kube-api-access-gbllm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.455498 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b8ce-account-create-5lq5r" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.464013 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-k5qmh" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.512996 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbllm\" (UniqueName: \"kubernetes.io/projected/62832609-e9cd-41a5-a77e-4fdbe35cd12e-kube-api-access-gbllm\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.614273 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lt97\" (UniqueName: \"kubernetes.io/projected/8f79e4ba-2854-4d57-9d76-391f020c62ce-kube-api-access-6lt97\") pod \"8f79e4ba-2854-4d57-9d76-391f020c62ce\" (UID: \"8f79e4ba-2854-4d57-9d76-391f020c62ce\") " Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.614402 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lspff\" (UniqueName: \"kubernetes.io/projected/9b58c075-8b99-4903-8e23-973f208b4edd-kube-api-access-lspff\") pod \"9b58c075-8b99-4903-8e23-973f208b4edd\" (UID: \"9b58c075-8b99-4903-8e23-973f208b4edd\") " Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.619579 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b58c075-8b99-4903-8e23-973f208b4edd-kube-api-access-lspff" (OuterVolumeSpecName: "kube-api-access-lspff") pod "9b58c075-8b99-4903-8e23-973f208b4edd" (UID: "9b58c075-8b99-4903-8e23-973f208b4edd"). InnerVolumeSpecName "kube-api-access-lspff". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.621698 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f79e4ba-2854-4d57-9d76-391f020c62ce-kube-api-access-6lt97" (OuterVolumeSpecName: "kube-api-access-6lt97") pod "8f79e4ba-2854-4d57-9d76-391f020c62ce" (UID: "8f79e4ba-2854-4d57-9d76-391f020c62ce"). InnerVolumeSpecName "kube-api-access-6lt97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.716507 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lt97\" (UniqueName: \"kubernetes.io/projected/8f79e4ba-2854-4d57-9d76-391f020c62ce-kube-api-access-6lt97\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.716564 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lspff\" (UniqueName: \"kubernetes.io/projected/9b58c075-8b99-4903-8e23-973f208b4edd-kube-api-access-lspff\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.865676 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6kqsg" podUID="12d2ae39-f918-485b-a8c4-b083cdf9d48f" containerName="ovn-controller" probeResult="failure" output=< Sep 29 10:00:24 crc kubenswrapper[4922]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 29 10:00:24 crc kubenswrapper[4922]: > Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.923354 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-980b-account-create-z5h6z" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.923340 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-980b-account-create-z5h6z" event={"ID":"62832609-e9cd-41a5-a77e-4fdbe35cd12e","Type":"ContainerDied","Data":"cbcd94ee95672ace316804aa608c9e0623b22b692c139595bd3e8616001e61ae"} Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.923570 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbcd94ee95672ace316804aa608c9e0623b22b692c139595bd3e8616001e61ae" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.926114 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-k5qmh" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.926132 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k5qmh" event={"ID":"8f79e4ba-2854-4d57-9d76-391f020c62ce","Type":"ContainerDied","Data":"337a1f265dcc0a73fba651171161875c0e1abfea4a2fcaeedbb61372db795ba2"} Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.926185 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="337a1f265dcc0a73fba651171161875c0e1abfea4a2fcaeedbb61372db795ba2" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.934929 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"12728c1384952c16da6ed26c673fab1190add477fcdc440b8b0ca0c628938a27"} Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.950224 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b8ce-account-create-5lq5r" event={"ID":"9b58c075-8b99-4903-8e23-973f208b4edd","Type":"ContainerDied","Data":"e22a27b100d84fb9414b77d056122197b99672eaec0b215b2cbc7cf0e8a45d3c"} Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.950303 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e22a27b100d84fb9414b77d056122197b99672eaec0b215b2cbc7cf0e8a45d3c" Sep 29 10:00:24 crc kubenswrapper[4922]: I0929 10:00:24.950441 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b8ce-account-create-5lq5r" Sep 29 10:00:25 crc kubenswrapper[4922]: I0929 10:00:25.969102 4922 generic.go:334] "Generic (PLEG): container finished" podID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" containerID="c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0" exitCode=0 Sep 29 10:00:25 crc kubenswrapper[4922]: I0929 10:00:25.969255 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2ad8ac2-2191-43ab-9979-9ccbe368d883","Type":"ContainerDied","Data":"c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0"} Sep 29 10:00:25 crc kubenswrapper[4922]: I0929 10:00:25.972455 4922 generic.go:334] "Generic (PLEG): container finished" podID="3a51d044-d162-4938-8ca4-b4a200e78739" containerID="faadddd4d5d9c294d7d0d82cbdccb37186b92c157c6a6cbb4ca84753ab65f49a" exitCode=0 Sep 29 10:00:25 crc kubenswrapper[4922]: I0929 10:00:25.972551 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a51d044-d162-4938-8ca4-b4a200e78739","Type":"ContainerDied","Data":"faadddd4d5d9c294d7d0d82cbdccb37186b92c157c6a6cbb4ca84753ab65f49a"} Sep 29 10:00:25 crc kubenswrapper[4922]: I0929 10:00:25.999426 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"ccd24e67b70ab7f9592d97a36d5192130cda1277ee794cb68cbda14c43a5a024"} Sep 29 10:00:27 crc kubenswrapper[4922]: I0929 10:00:27.010567 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2ad8ac2-2191-43ab-9979-9ccbe368d883","Type":"ContainerStarted","Data":"315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549"} Sep 29 10:00:27 crc kubenswrapper[4922]: I0929 10:00:27.011193 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:00:27 
crc kubenswrapper[4922]: I0929 10:00:27.012978 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a51d044-d162-4938-8ca4-b4a200e78739","Type":"ContainerStarted","Data":"7df93ad0c009d4519aa6464d01fcf6e1224050a5ded4664b07d5a04cd1aad245"} Sep 29 10:00:27 crc kubenswrapper[4922]: I0929 10:00:27.013167 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 29 10:00:27 crc kubenswrapper[4922]: I0929 10:00:27.018370 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"0eb1503f75a9d38cb551f42eddd2231fc367dba7763fc8f2f34fa757dfd780b9"} Sep 29 10:00:27 crc kubenswrapper[4922]: I0929 10:00:27.018416 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"485ee84ca7feb8fa4b1e13d78508bfcc2b24206cdb2549133ffc3ae6c27f47eb"} Sep 29 10:00:27 crc kubenswrapper[4922]: I0929 10:00:27.038581 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.315141469 podStartE2EDuration="1m3.038562107s" podCreationTimestamp="2025-09-29 09:59:24 +0000 UTC" firstStartedPulling="2025-09-29 09:59:41.304145051 +0000 UTC m=+906.670375315" lastFinishedPulling="2025-09-29 09:59:52.027565689 +0000 UTC m=+917.393795953" observedRunningTime="2025-09-29 10:00:27.036738807 +0000 UTC m=+952.402969071" watchObservedRunningTime="2025-09-29 10:00:27.038562107 +0000 UTC m=+952.404792361" Sep 29 10:00:27 crc kubenswrapper[4922]: I0929 10:00:27.067240 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.425487802 podStartE2EDuration="1m3.067216061s" podCreationTimestamp="2025-09-29 09:59:24 +0000 UTC" 
firstStartedPulling="2025-09-29 09:59:41.234938438 +0000 UTC m=+906.601168702" lastFinishedPulling="2025-09-29 09:59:52.876666697 +0000 UTC m=+918.242896961" observedRunningTime="2025-09-29 10:00:27.063598894 +0000 UTC m=+952.429829168" watchObservedRunningTime="2025-09-29 10:00:27.067216061 +0000 UTC m=+952.433446345" Sep 29 10:00:28 crc kubenswrapper[4922]: I0929 10:00:28.040988 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"409ecb9139bc6cbe050a43b54ee29b5407906558ef141775489f564539ee60a0"} Sep 29 10:00:28 crc kubenswrapper[4922]: I0929 10:00:28.041472 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"5821ce2bb351527fcc8fb16bb45168d5a1c272e9e1efeabf1fb43d137fd07711"} Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.068906 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"f5597acab8772c9535d4af918c5679568673b6a6130b4b4baa7e3c76170f169b"} Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.069333 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"2e6631c791388a2054142b4ccdca1428a6788cfd6978002af86fbeaca9ed0b35"} Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.069349 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"936ef1971f2b818c8a637d933008fd227aedfb14249f288db106e16d4f75bcf6"} Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.069359 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"e07abddbe305dace49d35175bd74f929af92103da6727b73e11748df52f07c07"} Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.069369 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1107f41-4b1d-4531-91cc-329f8ba26bea","Type":"ContainerStarted","Data":"134405596c3e3a69d994771a798979bd8a11b2fea70db67aa51fa98f0413b7a9"} Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.114082 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.811657264 podStartE2EDuration="28.114058857s" podCreationTimestamp="2025-09-29 10:00:01 +0000 UTC" firstStartedPulling="2025-09-29 10:00:19.277781684 +0000 UTC m=+944.644011948" lastFinishedPulling="2025-09-29 10:00:27.580183277 +0000 UTC m=+952.946413541" observedRunningTime="2025-09-29 10:00:29.106145652 +0000 UTC m=+954.472375936" watchObservedRunningTime="2025-09-29 10:00:29.114058857 +0000 UTC m=+954.480289121" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.399546 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-4gkdr"] Sep 29 10:00:29 crc kubenswrapper[4922]: E0929 10:00:29.400403 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62832609-e9cd-41a5-a77e-4fdbe35cd12e" containerName="mariadb-account-create" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.400482 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="62832609-e9cd-41a5-a77e-4fdbe35cd12e" containerName="mariadb-account-create" Sep 29 10:00:29 crc kubenswrapper[4922]: E0929 10:00:29.400563 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f79e4ba-2854-4d57-9d76-391f020c62ce" containerName="mariadb-database-create" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.400618 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8f79e4ba-2854-4d57-9d76-391f020c62ce" containerName="mariadb-database-create" Sep 29 10:00:29 crc kubenswrapper[4922]: E0929 10:00:29.400691 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b58c075-8b99-4903-8e23-973f208b4edd" containerName="mariadb-account-create" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.400749 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b58c075-8b99-4903-8e23-973f208b4edd" containerName="mariadb-account-create" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.401204 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b58c075-8b99-4903-8e23-973f208b4edd" containerName="mariadb-account-create" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.401312 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f79e4ba-2854-4d57-9d76-391f020c62ce" containerName="mariadb-database-create" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.401397 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="62832609-e9cd-41a5-a77e-4fdbe35cd12e" containerName="mariadb-account-create" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.402383 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.409743 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.416870 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-4gkdr"] Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.435096 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzc8r\" (UniqueName: \"kubernetes.io/projected/90d0dcdf-6643-403b-875f-6d1f3fd797c9-kube-api-access-xzc8r\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.435175 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.435202 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-config\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.435237 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.435288 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.435310 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.536472 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-config\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.536552 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.536616 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.536640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.536692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzc8r\" (UniqueName: \"kubernetes.io/projected/90d0dcdf-6643-403b-875f-6d1f3fd797c9-kube-api-access-xzc8r\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.536742 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.537581 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-config\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.538194 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 
10:00:29.538189 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.538883 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.538917 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.573454 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzc8r\" (UniqueName: \"kubernetes.io/projected/90d0dcdf-6643-403b-875f-6d1f3fd797c9-kube-api-access-xzc8r\") pod \"dnsmasq-dns-77585f5f8c-4gkdr\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.695393 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-57wm2"] Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.696987 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.699365 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7hj8z" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.699786 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.716144 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-57wm2"] Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.724255 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.739186 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-combined-ca-bundle\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.739222 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j67v7\" (UniqueName: \"kubernetes.io/projected/8fd503cc-5b91-4ee8-b354-ada3ba37812a-kube-api-access-j67v7\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.739267 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-db-sync-config-data\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 
10:00:29.739324 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-config-data\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.841259 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-combined-ca-bundle\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.841324 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j67v7\" (UniqueName: \"kubernetes.io/projected/8fd503cc-5b91-4ee8-b354-ada3ba37812a-kube-api-access-j67v7\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.841424 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-db-sync-config-data\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.841579 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-config-data\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.849318 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-config-data\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.852448 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-db-sync-config-data\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.863520 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-combined-ca-bundle\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.868449 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j67v7\" (UniqueName: \"kubernetes.io/projected/8fd503cc-5b91-4ee8-b354-ada3ba37812a-kube-api-access-j67v7\") pod \"glance-db-sync-57wm2\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") " pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.909247 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6kqsg" podUID="12d2ae39-f918-485b-a8c4-b083cdf9d48f" containerName="ovn-controller" probeResult="failure" output=< Sep 29 10:00:29 crc kubenswrapper[4922]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 29 10:00:29 crc kubenswrapper[4922]: > Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 10:00:29.922097 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 10:00:29 crc kubenswrapper[4922]: I0929 
10:00:29.927542 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bzdcj" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.018128 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-57wm2" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.164269 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6kqsg-config-ph4kc"] Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.167847 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.170630 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.182842 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kqsg-config-ph4kc"] Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.312462 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-4gkdr"] Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.350990 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run-ovn\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.351243 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 
10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.351335 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-log-ovn\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.351785 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-additional-scripts\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.352095 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6944\" (UniqueName: \"kubernetes.io/projected/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-kube-api-access-t6944\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.352222 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-scripts\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.453330 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6944\" (UniqueName: \"kubernetes.io/projected/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-kube-api-access-t6944\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: 
\"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.453394 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-scripts\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.453430 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run-ovn\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.453478 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.453511 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-log-ovn\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.453550 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-additional-scripts\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: 
\"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.454462 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run-ovn\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.454489 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-log-ovn\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.454472 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.454808 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-additional-scripts\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.456288 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-scripts\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " 
pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.474905 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6944\" (UniqueName: \"kubernetes.io/projected/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-kube-api-access-t6944\") pod \"ovn-controller-6kqsg-config-ph4kc\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.512185 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:30 crc kubenswrapper[4922]: I0929 10:00:30.698243 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-57wm2"] Sep 29 10:00:31 crc kubenswrapper[4922]: I0929 10:00:31.009295 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kqsg-config-ph4kc"] Sep 29 10:00:31 crc kubenswrapper[4922]: W0929 10:00:31.011113 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8fafdb0_9ebd_4db2_9fb7_070572d52aca.slice/crio-0a763024e21533ebc114ac7f94639454d2a6edef34e234e3683c886c0c601958 WatchSource:0}: Error finding container 0a763024e21533ebc114ac7f94639454d2a6edef34e234e3683c886c0c601958: Status 404 returned error can't find the container with id 0a763024e21533ebc114ac7f94639454d2a6edef34e234e3683c886c0c601958 Sep 29 10:00:31 crc kubenswrapper[4922]: I0929 10:00:31.098647 4922 generic.go:334] "Generic (PLEG): container finished" podID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerID="dcaad16f64c31493de3f7cf2e3fd7264a73357388e3cdb51c94e13195e6483d7" exitCode=0 Sep 29 10:00:31 crc kubenswrapper[4922]: I0929 10:00:31.098863 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" 
event={"ID":"90d0dcdf-6643-403b-875f-6d1f3fd797c9","Type":"ContainerDied","Data":"dcaad16f64c31493de3f7cf2e3fd7264a73357388e3cdb51c94e13195e6483d7"} Sep 29 10:00:31 crc kubenswrapper[4922]: I0929 10:00:31.099296 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" event={"ID":"90d0dcdf-6643-403b-875f-6d1f3fd797c9","Type":"ContainerStarted","Data":"6d9f1bab775f14a7a0f1d24d2da7fe580f29177cfca2a70ccee376e73393bc9a"} Sep 29 10:00:31 crc kubenswrapper[4922]: I0929 10:00:31.101419 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kqsg-config-ph4kc" event={"ID":"f8fafdb0-9ebd-4db2-9fb7-070572d52aca","Type":"ContainerStarted","Data":"0a763024e21533ebc114ac7f94639454d2a6edef34e234e3683c886c0c601958"} Sep 29 10:00:31 crc kubenswrapper[4922]: I0929 10:00:31.106675 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-57wm2" event={"ID":"8fd503cc-5b91-4ee8-b354-ada3ba37812a","Type":"ContainerStarted","Data":"483a376404b03ec277541d32a2d83a63318515b2173c7c54aeb5a0005259b36a"} Sep 29 10:00:32 crc kubenswrapper[4922]: I0929 10:00:32.119504 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" event={"ID":"90d0dcdf-6643-403b-875f-6d1f3fd797c9","Type":"ContainerStarted","Data":"f0a3a20c7d636fab40180967310e05f56eb26bfce339fdf3a99619e6715b2558"} Sep 29 10:00:32 crc kubenswrapper[4922]: I0929 10:00:32.120338 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:32 crc kubenswrapper[4922]: I0929 10:00:32.122470 4922 generic.go:334] "Generic (PLEG): container finished" podID="f8fafdb0-9ebd-4db2-9fb7-070572d52aca" containerID="c0c1513621326b7ee76988d10481ab49df8430052ed9ac09b6703845a7d4a027" exitCode=0 Sep 29 10:00:32 crc kubenswrapper[4922]: I0929 10:00:32.122564 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-6kqsg-config-ph4kc" event={"ID":"f8fafdb0-9ebd-4db2-9fb7-070572d52aca","Type":"ContainerDied","Data":"c0c1513621326b7ee76988d10481ab49df8430052ed9ac09b6703845a7d4a027"} Sep 29 10:00:32 crc kubenswrapper[4922]: I0929 10:00:32.150412 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" podStartSLOduration=3.150381537 podStartE2EDuration="3.150381537s" podCreationTimestamp="2025-09-29 10:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:00:32.141558438 +0000 UTC m=+957.507788722" watchObservedRunningTime="2025-09-29 10:00:32.150381537 +0000 UTC m=+957.516611801" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.481960 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.616445 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run\") pod \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.616525 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-scripts\") pod \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.616615 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-log-ovn\") pod \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " Sep 29 
10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.616581 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run" (OuterVolumeSpecName: "var-run") pod "f8fafdb0-9ebd-4db2-9fb7-070572d52aca" (UID: "f8fafdb0-9ebd-4db2-9fb7-070572d52aca"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.616725 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run-ovn\") pod \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.616775 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f8fafdb0-9ebd-4db2-9fb7-070572d52aca" (UID: "f8fafdb0-9ebd-4db2-9fb7-070572d52aca"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.616816 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-additional-scripts\") pod \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.616841 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f8fafdb0-9ebd-4db2-9fb7-070572d52aca" (UID: "f8fafdb0-9ebd-4db2-9fb7-070572d52aca"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.616869 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6944\" (UniqueName: \"kubernetes.io/projected/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-kube-api-access-t6944\") pod \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\" (UID: \"f8fafdb0-9ebd-4db2-9fb7-070572d52aca\") " Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.617277 4922 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.617295 4922 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.617374 4922 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-var-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.617730 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f8fafdb0-9ebd-4db2-9fb7-070572d52aca" (UID: "f8fafdb0-9ebd-4db2-9fb7-070572d52aca"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.618172 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-scripts" (OuterVolumeSpecName: "scripts") pod "f8fafdb0-9ebd-4db2-9fb7-070572d52aca" (UID: "f8fafdb0-9ebd-4db2-9fb7-070572d52aca"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.624161 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-kube-api-access-t6944" (OuterVolumeSpecName: "kube-api-access-t6944") pod "f8fafdb0-9ebd-4db2-9fb7-070572d52aca" (UID: "f8fafdb0-9ebd-4db2-9fb7-070572d52aca"). InnerVolumeSpecName "kube-api-access-t6944". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.719544 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.719912 4922 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:33 crc kubenswrapper[4922]: I0929 10:00:33.720041 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6944\" (UniqueName: \"kubernetes.io/projected/f8fafdb0-9ebd-4db2-9fb7-070572d52aca-kube-api-access-t6944\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.145728 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kqsg-config-ph4kc" event={"ID":"f8fafdb0-9ebd-4db2-9fb7-070572d52aca","Type":"ContainerDied","Data":"0a763024e21533ebc114ac7f94639454d2a6edef34e234e3683c886c0c601958"} Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.146100 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a763024e21533ebc114ac7f94639454d2a6edef34e234e3683c886c0c601958" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.145815 4922 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kqsg-config-ph4kc" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.610986 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6kqsg-config-ph4kc"] Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.636979 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6kqsg-config-ph4kc"] Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.705662 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6kqsg-config-kn6dq"] Sep 29 10:00:34 crc kubenswrapper[4922]: E0929 10:00:34.707300 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fafdb0-9ebd-4db2-9fb7-070572d52aca" containerName="ovn-config" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.707334 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fafdb0-9ebd-4db2-9fb7-070572d52aca" containerName="ovn-config" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.710292 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fafdb0-9ebd-4db2-9fb7-070572d52aca" containerName="ovn-config" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.711307 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.715484 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.718709 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kqsg-config-kn6dq"] Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.845130 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-log-ovn\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.845242 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.845409 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run-ovn\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.845659 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-additional-scripts\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: 
\"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.845749 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcszm\" (UniqueName: \"kubernetes.io/projected/4a8ca4de-80d0-4219-bf2f-4a95a887db28-kube-api-access-lcszm\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.845792 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-scripts\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.864468 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6kqsg" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.948425 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-additional-scripts\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.948494 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcszm\" (UniqueName: \"kubernetes.io/projected/4a8ca4de-80d0-4219-bf2f-4a95a887db28-kube-api-access-lcszm\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.948522 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-scripts\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.948601 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-log-ovn\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.948662 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.948686 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run-ovn\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.949093 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run-ovn\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.949370 4922 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-log-ovn\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.949431 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-additional-scripts\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.949376 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.951155 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-scripts\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:34 crc kubenswrapper[4922]: I0929 10:00:34.971983 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcszm\" (UniqueName: \"kubernetes.io/projected/4a8ca4de-80d0-4219-bf2f-4a95a887db28-kube-api-access-lcszm\") pod \"ovn-controller-6kqsg-config-kn6dq\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:35 crc kubenswrapper[4922]: I0929 10:00:35.047413 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:35 crc kubenswrapper[4922]: I0929 10:00:35.350965 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kqsg-config-kn6dq"] Sep 29 10:00:35 crc kubenswrapper[4922]: W0929 10:00:35.365239 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a8ca4de_80d0_4219_bf2f_4a95a887db28.slice/crio-901f9015a824953a943a1d8c898891b9f0dbcc71a43bdd19ed6c5fe39b602770 WatchSource:0}: Error finding container 901f9015a824953a943a1d8c898891b9f0dbcc71a43bdd19ed6c5fe39b602770: Status 404 returned error can't find the container with id 901f9015a824953a943a1d8c898891b9f0dbcc71a43bdd19ed6c5fe39b602770 Sep 29 10:00:35 crc kubenswrapper[4922]: I0929 10:00:35.477220 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fafdb0-9ebd-4db2-9fb7-070572d52aca" path="/var/lib/kubelet/pods/f8fafdb0-9ebd-4db2-9fb7-070572d52aca/volumes" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.146054 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.244654 4922 generic.go:334] "Generic (PLEG): container finished" podID="4a8ca4de-80d0-4219-bf2f-4a95a887db28" containerID="5ce738685b7ac4bcae711e4dae129553a69d4b6d2e7ea4ec3d0c889c1e33f51e" exitCode=0 Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.244894 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kqsg-config-kn6dq" event={"ID":"4a8ca4de-80d0-4219-bf2f-4a95a887db28","Type":"ContainerDied","Data":"5ce738685b7ac4bcae711e4dae129553a69d4b6d2e7ea4ec3d0c889c1e33f51e"} Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.245065 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kqsg-config-kn6dq" 
event={"ID":"4a8ca4de-80d0-4219-bf2f-4a95a887db28","Type":"ContainerStarted","Data":"901f9015a824953a943a1d8c898891b9f0dbcc71a43bdd19ed6c5fe39b602770"} Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.401617 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mcgfr"] Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.403524 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mcgfr" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.409802 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mcgfr"] Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.503701 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qxvjk"] Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.504875 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qxvjk" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.526642 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qxvjk"] Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.587731 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjw7w\" (UniqueName: \"kubernetes.io/projected/6e9cd745-1b41-41dd-96a7-7e67ef51684d-kube-api-access-kjw7w\") pod \"cinder-db-create-mcgfr\" (UID: \"6e9cd745-1b41-41dd-96a7-7e67ef51684d\") " pod="openstack/cinder-db-create-mcgfr" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.690696 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng796\" (UniqueName: \"kubernetes.io/projected/9b8c7f07-5d37-4501-9c88-32cff699802f-kube-api-access-ng796\") pod \"barbican-db-create-qxvjk\" (UID: \"9b8c7f07-5d37-4501-9c88-32cff699802f\") " pod="openstack/barbican-db-create-qxvjk" Sep 29 10:00:36 crc 
kubenswrapper[4922]: I0929 10:00:36.692344 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjw7w\" (UniqueName: \"kubernetes.io/projected/6e9cd745-1b41-41dd-96a7-7e67ef51684d-kube-api-access-kjw7w\") pod \"cinder-db-create-mcgfr\" (UID: \"6e9cd745-1b41-41dd-96a7-7e67ef51684d\") " pod="openstack/cinder-db-create-mcgfr" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.719115 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-gmsv6"] Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.720051 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjw7w\" (UniqueName: \"kubernetes.io/projected/6e9cd745-1b41-41dd-96a7-7e67ef51684d-kube-api-access-kjw7w\") pod \"cinder-db-create-mcgfr\" (UID: \"6e9cd745-1b41-41dd-96a7-7e67ef51684d\") " pod="openstack/cinder-db-create-mcgfr" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.720505 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gmsv6" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.728818 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gmsv6"] Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.754249 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mcgfr" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.794281 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng796\" (UniqueName: \"kubernetes.io/projected/9b8c7f07-5d37-4501-9c88-32cff699802f-kube-api-access-ng796\") pod \"barbican-db-create-qxvjk\" (UID: \"9b8c7f07-5d37-4501-9c88-32cff699802f\") " pod="openstack/barbican-db-create-qxvjk" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.817045 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng796\" (UniqueName: \"kubernetes.io/projected/9b8c7f07-5d37-4501-9c88-32cff699802f-kube-api-access-ng796\") pod \"barbican-db-create-qxvjk\" (UID: \"9b8c7f07-5d37-4501-9c88-32cff699802f\") " pod="openstack/barbican-db-create-qxvjk" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.820944 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qxvjk" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.896047 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp7jr\" (UniqueName: \"kubernetes.io/projected/7e007462-51b9-4640-94b1-019c85704aed-kube-api-access-bp7jr\") pod \"neutron-db-create-gmsv6\" (UID: \"7e007462-51b9-4640-94b1-019c85704aed\") " pod="openstack/neutron-db-create-gmsv6" Sep 29 10:00:36 crc kubenswrapper[4922]: I0929 10:00:36.998332 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp7jr\" (UniqueName: \"kubernetes.io/projected/7e007462-51b9-4640-94b1-019c85704aed-kube-api-access-bp7jr\") pod \"neutron-db-create-gmsv6\" (UID: \"7e007462-51b9-4640-94b1-019c85704aed\") " pod="openstack/neutron-db-create-gmsv6" Sep 29 10:00:37 crc kubenswrapper[4922]: I0929 10:00:37.022037 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp7jr\" 
(UniqueName: \"kubernetes.io/projected/7e007462-51b9-4640-94b1-019c85704aed-kube-api-access-bp7jr\") pod \"neutron-db-create-gmsv6\" (UID: \"7e007462-51b9-4640-94b1-019c85704aed\") " pod="openstack/neutron-db-create-gmsv6" Sep 29 10:00:37 crc kubenswrapper[4922]: I0929 10:00:37.076353 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gmsv6" Sep 29 10:00:38 crc kubenswrapper[4922]: I0929 10:00:38.734765 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c110-account-create-mxfv8"] Sep 29 10:00:38 crc kubenswrapper[4922]: I0929 10:00:38.736860 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c110-account-create-mxfv8" Sep 29 10:00:38 crc kubenswrapper[4922]: I0929 10:00:38.742847 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c110-account-create-mxfv8"] Sep 29 10:00:38 crc kubenswrapper[4922]: I0929 10:00:38.746420 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 29 10:00:38 crc kubenswrapper[4922]: I0929 10:00:38.837033 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x5gp\" (UniqueName: \"kubernetes.io/projected/e1420681-6f1a-40a2-8176-32fcff81af93-kube-api-access-7x5gp\") pod \"keystone-c110-account-create-mxfv8\" (UID: \"e1420681-6f1a-40a2-8176-32fcff81af93\") " pod="openstack/keystone-c110-account-create-mxfv8" Sep 29 10:00:38 crc kubenswrapper[4922]: I0929 10:00:38.938691 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x5gp\" (UniqueName: \"kubernetes.io/projected/e1420681-6f1a-40a2-8176-32fcff81af93-kube-api-access-7x5gp\") pod \"keystone-c110-account-create-mxfv8\" (UID: \"e1420681-6f1a-40a2-8176-32fcff81af93\") " pod="openstack/keystone-c110-account-create-mxfv8" Sep 29 10:00:38 crc kubenswrapper[4922]: I0929 
10:00:38.976546 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x5gp\" (UniqueName: \"kubernetes.io/projected/e1420681-6f1a-40a2-8176-32fcff81af93-kube-api-access-7x5gp\") pod \"keystone-c110-account-create-mxfv8\" (UID: \"e1420681-6f1a-40a2-8176-32fcff81af93\") " pod="openstack/keystone-c110-account-create-mxfv8" Sep 29 10:00:39 crc kubenswrapper[4922]: I0929 10:00:39.084145 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c110-account-create-mxfv8" Sep 29 10:00:39 crc kubenswrapper[4922]: I0929 10:00:39.727000 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:00:39 crc kubenswrapper[4922]: I0929 10:00:39.801181 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8nm2t"] Sep 29 10:00:39 crc kubenswrapper[4922]: I0929 10:00:39.812360 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-8nm2t" podUID="3b99b68d-2f67-466e-88af-d60bc5d9d283" containerName="dnsmasq-dns" containerID="cri-o://acd183b54cfcc96088f01e7ef8f8f908c8a88e94de3bb6fcb1744c4614944f03" gracePeriod=10 Sep 29 10:00:40 crc kubenswrapper[4922]: I0929 10:00:40.286062 4922 generic.go:334] "Generic (PLEG): container finished" podID="3b99b68d-2f67-466e-88af-d60bc5d9d283" containerID="acd183b54cfcc96088f01e7ef8f8f908c8a88e94de3bb6fcb1744c4614944f03" exitCode=0 Sep 29 10:00:40 crc kubenswrapper[4922]: I0929 10:00:40.286148 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8nm2t" event={"ID":"3b99b68d-2f67-466e-88af-d60bc5d9d283","Type":"ContainerDied","Data":"acd183b54cfcc96088f01e7ef8f8f908c8a88e94de3bb6fcb1744c4614944f03"} Sep 29 10:00:41 crc kubenswrapper[4922]: I0929 10:00:41.925886 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-8nm2t" 
podUID="3b99b68d-2f67-466e-88af-d60bc5d9d283" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.646468 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.721850 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.762662 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-scripts\") pod \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.762810 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run\") pod \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.762876 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcszm\" (UniqueName: \"kubernetes.io/projected/4a8ca4de-80d0-4219-bf2f-4a95a887db28-kube-api-access-lcszm\") pod \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.763011 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run-ovn\") pod \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 
10:00:43.763064 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-log-ovn\") pod \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.763193 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-additional-scripts\") pod \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\" (UID: \"4a8ca4de-80d0-4219-bf2f-4a95a887db28\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.764660 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4a8ca4de-80d0-4219-bf2f-4a95a887db28" (UID: "4a8ca4de-80d0-4219-bf2f-4a95a887db28"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.765620 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-scripts" (OuterVolumeSpecName: "scripts") pod "4a8ca4de-80d0-4219-bf2f-4a95a887db28" (UID: "4a8ca4de-80d0-4219-bf2f-4a95a887db28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.765651 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run" (OuterVolumeSpecName: "var-run") pod "4a8ca4de-80d0-4219-bf2f-4a95a887db28" (UID: "4a8ca4de-80d0-4219-bf2f-4a95a887db28"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.766107 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4a8ca4de-80d0-4219-bf2f-4a95a887db28" (UID: "4a8ca4de-80d0-4219-bf2f-4a95a887db28"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.766182 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4a8ca4de-80d0-4219-bf2f-4a95a887db28" (UID: "4a8ca4de-80d0-4219-bf2f-4a95a887db28"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.772312 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a8ca4de-80d0-4219-bf2f-4a95a887db28-kube-api-access-lcszm" (OuterVolumeSpecName: "kube-api-access-lcszm") pod "4a8ca4de-80d0-4219-bf2f-4a95a887db28" (UID: "4a8ca4de-80d0-4219-bf2f-4a95a887db28"). InnerVolumeSpecName "kube-api-access-lcszm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.865728 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-sb\") pod \"3b99b68d-2f67-466e-88af-d60bc5d9d283\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.865798 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-config\") pod \"3b99b68d-2f67-466e-88af-d60bc5d9d283\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.865895 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-dns-svc\") pod \"3b99b68d-2f67-466e-88af-d60bc5d9d283\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.865938 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-nb\") pod \"3b99b68d-2f67-466e-88af-d60bc5d9d283\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.866146 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfbkk\" (UniqueName: \"kubernetes.io/projected/3b99b68d-2f67-466e-88af-d60bc5d9d283-kube-api-access-pfbkk\") pod \"3b99b68d-2f67-466e-88af-d60bc5d9d283\" (UID: \"3b99b68d-2f67-466e-88af-d60bc5d9d283\") " Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.866620 4922 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.866638 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a8ca4de-80d0-4219-bf2f-4a95a887db28-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.866654 4922 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.866665 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcszm\" (UniqueName: \"kubernetes.io/projected/4a8ca4de-80d0-4219-bf2f-4a95a887db28-kube-api-access-lcszm\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.866712 4922 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.866721 4922 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4a8ca4de-80d0-4219-bf2f-4a95a887db28-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.880561 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b99b68d-2f67-466e-88af-d60bc5d9d283-kube-api-access-pfbkk" (OuterVolumeSpecName: "kube-api-access-pfbkk") pod "3b99b68d-2f67-466e-88af-d60bc5d9d283" (UID: "3b99b68d-2f67-466e-88af-d60bc5d9d283"). InnerVolumeSpecName "kube-api-access-pfbkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.914023 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b99b68d-2f67-466e-88af-d60bc5d9d283" (UID: "3b99b68d-2f67-466e-88af-d60bc5d9d283"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.925096 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b99b68d-2f67-466e-88af-d60bc5d9d283" (UID: "3b99b68d-2f67-466e-88af-d60bc5d9d283"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.933078 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b99b68d-2f67-466e-88af-d60bc5d9d283" (UID: "3b99b68d-2f67-466e-88af-d60bc5d9d283"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.940244 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-config" (OuterVolumeSpecName: "config") pod "3b99b68d-2f67-466e-88af-d60bc5d9d283" (UID: "3b99b68d-2f67-466e-88af-d60bc5d9d283"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.968602 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfbkk\" (UniqueName: \"kubernetes.io/projected/3b99b68d-2f67-466e-88af-d60bc5d9d283-kube-api-access-pfbkk\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.968657 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.968670 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.968680 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:43 crc kubenswrapper[4922]: I0929 10:00:43.968689 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b99b68d-2f67-466e-88af-d60bc5d9d283-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.009659 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c110-account-create-mxfv8"] Sep 29 10:00:44 crc kubenswrapper[4922]: W0929 10:00:44.034793 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1420681_6f1a_40a2_8176_32fcff81af93.slice/crio-6f8936b6bebc6b693798327dca5ac3283ff8ba8d1d039209c4530d125d69510f WatchSource:0}: Error finding container 6f8936b6bebc6b693798327dca5ac3283ff8ba8d1d039209c4530d125d69510f: Status 404 returned error can't 
find the container with id 6f8936b6bebc6b693798327dca5ac3283ff8ba8d1d039209c4530d125d69510f Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.154278 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gmsv6"] Sep 29 10:00:44 crc kubenswrapper[4922]: W0929 10:00:44.163258 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e007462_51b9_4640_94b1_019c85704aed.slice/crio-6076fdb69a1a4a1e38da95d0031b22d1cd6d50c37ed05754447a5718b22db586 WatchSource:0}: Error finding container 6076fdb69a1a4a1e38da95d0031b22d1cd6d50c37ed05754447a5718b22db586: Status 404 returned error can't find the container with id 6076fdb69a1a4a1e38da95d0031b22d1cd6d50c37ed05754447a5718b22db586 Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.231321 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qxvjk"] Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.241699 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mcgfr"] Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.324946 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kqsg-config-kn6dq" event={"ID":"4a8ca4de-80d0-4219-bf2f-4a95a887db28","Type":"ContainerDied","Data":"901f9015a824953a943a1d8c898891b9f0dbcc71a43bdd19ed6c5fe39b602770"} Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.325003 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901f9015a824953a943a1d8c898891b9f0dbcc71a43bdd19ed6c5fe39b602770" Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.325100 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kqsg-config-kn6dq" Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.332496 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8nm2t" Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.332454 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8nm2t" event={"ID":"3b99b68d-2f67-466e-88af-d60bc5d9d283","Type":"ContainerDied","Data":"c27f3edc2af2c051666dbaa912c3f7d6789eff3c41cf4ee1a131bbd5185165c9"} Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.332734 4922 scope.go:117] "RemoveContainer" containerID="acd183b54cfcc96088f01e7ef8f8f908c8a88e94de3bb6fcb1744c4614944f03" Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.334270 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gmsv6" event={"ID":"7e007462-51b9-4640-94b1-019c85704aed","Type":"ContainerStarted","Data":"6076fdb69a1a4a1e38da95d0031b22d1cd6d50c37ed05754447a5718b22db586"} Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.336180 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qxvjk" event={"ID":"9b8c7f07-5d37-4501-9c88-32cff699802f","Type":"ContainerStarted","Data":"64e43675f1bffae0ab94d7ebe6f0ad2a2e46fd4cfd5a035cb49fbfa7b130e357"} Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.337648 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mcgfr" event={"ID":"6e9cd745-1b41-41dd-96a7-7e67ef51684d","Type":"ContainerStarted","Data":"e8934f1043855d4629afd45c506a60fbdb54d5e2c695ff2c2ade178291a960e0"} Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.346006 4922 generic.go:334] "Generic (PLEG): container finished" podID="e1420681-6f1a-40a2-8176-32fcff81af93" containerID="83ac3deff58fb0730474627dd5e1c2225bc05ba71b520fa7b4a6d372f7bbca9d" exitCode=0 Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.346060 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c110-account-create-mxfv8" 
event={"ID":"e1420681-6f1a-40a2-8176-32fcff81af93","Type":"ContainerDied","Data":"83ac3deff58fb0730474627dd5e1c2225bc05ba71b520fa7b4a6d372f7bbca9d"} Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.346095 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c110-account-create-mxfv8" event={"ID":"e1420681-6f1a-40a2-8176-32fcff81af93","Type":"ContainerStarted","Data":"6f8936b6bebc6b693798327dca5ac3283ff8ba8d1d039209c4530d125d69510f"} Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.382449 4922 scope.go:117] "RemoveContainer" containerID="1b8f692fdaee6b5ccf2bb9ec973d28b87c59fd55d9fb231d5002149484ede311" Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.393871 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8nm2t"] Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.402169 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8nm2t"] Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.736421 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6kqsg-config-kn6dq"] Sep 29 10:00:44 crc kubenswrapper[4922]: I0929 10:00:44.741665 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6kqsg-config-kn6dq"] Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.359443 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b8c7f07-5d37-4501-9c88-32cff699802f" containerID="0c0a5d3986f52bd4a0cfe40389c3bcd945037a8d8ce88d86ebe1b0612b7d8d8d" exitCode=0 Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.360236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qxvjk" event={"ID":"9b8c7f07-5d37-4501-9c88-32cff699802f","Type":"ContainerDied","Data":"0c0a5d3986f52bd4a0cfe40389c3bcd945037a8d8ce88d86ebe1b0612b7d8d8d"} Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.361796 4922 generic.go:334] "Generic (PLEG): container finished" 
podID="6e9cd745-1b41-41dd-96a7-7e67ef51684d" containerID="7aac3f3cfcfccddafbdf693611a54a01f3b11a7739c4cebab1f3a5bded90f11d" exitCode=0 Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.361913 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mcgfr" event={"ID":"6e9cd745-1b41-41dd-96a7-7e67ef51684d","Type":"ContainerDied","Data":"7aac3f3cfcfccddafbdf693611a54a01f3b11a7739c4cebab1f3a5bded90f11d"} Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.363709 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-57wm2" event={"ID":"8fd503cc-5b91-4ee8-b354-ada3ba37812a","Type":"ContainerStarted","Data":"d7f13a60bfe17a8e6b865b9e72792711035c6a568ca0bf6357b24e2b42bf6ae5"} Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.367861 4922 generic.go:334] "Generic (PLEG): container finished" podID="7e007462-51b9-4640-94b1-019c85704aed" containerID="a4b9d9eed4bdc55a21030da4539f5f95a670836403ee1647feec04ae0bfb5980" exitCode=0 Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.368201 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gmsv6" event={"ID":"7e007462-51b9-4640-94b1-019c85704aed","Type":"ContainerDied","Data":"a4b9d9eed4bdc55a21030da4539f5f95a670836403ee1647feec04ae0bfb5980"} Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.473167 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-57wm2" podStartSLOduration=3.491338094 podStartE2EDuration="16.473144321s" podCreationTimestamp="2025-09-29 10:00:29 +0000 UTC" firstStartedPulling="2025-09-29 10:00:30.721872516 +0000 UTC m=+956.088102790" lastFinishedPulling="2025-09-29 10:00:43.703678753 +0000 UTC m=+969.069909017" observedRunningTime="2025-09-29 10:00:45.460097887 +0000 UTC m=+970.826328191" watchObservedRunningTime="2025-09-29 10:00:45.473144321 +0000 UTC m=+970.839374585" Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.483802 4922 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b99b68d-2f67-466e-88af-d60bc5d9d283" path="/var/lib/kubelet/pods/3b99b68d-2f67-466e-88af-d60bc5d9d283/volumes" Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.485686 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a8ca4de-80d0-4219-bf2f-4a95a887db28" path="/var/lib/kubelet/pods/4a8ca4de-80d0-4219-bf2f-4a95a887db28/volumes" Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.779469 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c110-account-create-mxfv8" Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.792323 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.916711 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x5gp\" (UniqueName: \"kubernetes.io/projected/e1420681-6f1a-40a2-8176-32fcff81af93-kube-api-access-7x5gp\") pod \"e1420681-6f1a-40a2-8176-32fcff81af93\" (UID: \"e1420681-6f1a-40a2-8176-32fcff81af93\") " Sep 29 10:00:45 crc kubenswrapper[4922]: I0929 10:00:45.925633 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1420681-6f1a-40a2-8176-32fcff81af93-kube-api-access-7x5gp" (OuterVolumeSpecName: "kube-api-access-7x5gp") pod "e1420681-6f1a-40a2-8176-32fcff81af93" (UID: "e1420681-6f1a-40a2-8176-32fcff81af93"). InnerVolumeSpecName "kube-api-access-7x5gp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:46 crc kubenswrapper[4922]: I0929 10:00:46.018996 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x5gp\" (UniqueName: \"kubernetes.io/projected/e1420681-6f1a-40a2-8176-32fcff81af93-kube-api-access-7x5gp\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:46 crc kubenswrapper[4922]: I0929 10:00:46.386656 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c110-account-create-mxfv8" event={"ID":"e1420681-6f1a-40a2-8176-32fcff81af93","Type":"ContainerDied","Data":"6f8936b6bebc6b693798327dca5ac3283ff8ba8d1d039209c4530d125d69510f"} Sep 29 10:00:46 crc kubenswrapper[4922]: I0929 10:00:46.387252 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8936b6bebc6b693798327dca5ac3283ff8ba8d1d039209c4530d125d69510f" Sep 29 10:00:46 crc kubenswrapper[4922]: I0929 10:00:46.387050 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c110-account-create-mxfv8" Sep 29 10:00:46 crc kubenswrapper[4922]: I0929 10:00:46.756570 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qxvjk" Sep 29 10:00:46 crc kubenswrapper[4922]: I0929 10:00:46.912204 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gmsv6" Sep 29 10:00:46 crc kubenswrapper[4922]: I0929 10:00:46.917987 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mcgfr" Sep 29 10:00:46 crc kubenswrapper[4922]: I0929 10:00:46.959688 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng796\" (UniqueName: \"kubernetes.io/projected/9b8c7f07-5d37-4501-9c88-32cff699802f-kube-api-access-ng796\") pod \"9b8c7f07-5d37-4501-9c88-32cff699802f\" (UID: \"9b8c7f07-5d37-4501-9c88-32cff699802f\") " Sep 29 10:00:46 crc kubenswrapper[4922]: I0929 10:00:46.966884 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8c7f07-5d37-4501-9c88-32cff699802f-kube-api-access-ng796" (OuterVolumeSpecName: "kube-api-access-ng796") pod "9b8c7f07-5d37-4501-9c88-32cff699802f" (UID: "9b8c7f07-5d37-4501-9c88-32cff699802f"). InnerVolumeSpecName "kube-api-access-ng796". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.061796 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjw7w\" (UniqueName: \"kubernetes.io/projected/6e9cd745-1b41-41dd-96a7-7e67ef51684d-kube-api-access-kjw7w\") pod \"6e9cd745-1b41-41dd-96a7-7e67ef51684d\" (UID: \"6e9cd745-1b41-41dd-96a7-7e67ef51684d\") " Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.061986 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp7jr\" (UniqueName: \"kubernetes.io/projected/7e007462-51b9-4640-94b1-019c85704aed-kube-api-access-bp7jr\") pod \"7e007462-51b9-4640-94b1-019c85704aed\" (UID: \"7e007462-51b9-4640-94b1-019c85704aed\") " Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.062468 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng796\" (UniqueName: \"kubernetes.io/projected/9b8c7f07-5d37-4501-9c88-32cff699802f-kube-api-access-ng796\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.066206 4922 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e007462-51b9-4640-94b1-019c85704aed-kube-api-access-bp7jr" (OuterVolumeSpecName: "kube-api-access-bp7jr") pod "7e007462-51b9-4640-94b1-019c85704aed" (UID: "7e007462-51b9-4640-94b1-019c85704aed"). InnerVolumeSpecName "kube-api-access-bp7jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.066739 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e9cd745-1b41-41dd-96a7-7e67ef51684d-kube-api-access-kjw7w" (OuterVolumeSpecName: "kube-api-access-kjw7w") pod "6e9cd745-1b41-41dd-96a7-7e67ef51684d" (UID: "6e9cd745-1b41-41dd-96a7-7e67ef51684d"). InnerVolumeSpecName "kube-api-access-kjw7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.164936 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp7jr\" (UniqueName: \"kubernetes.io/projected/7e007462-51b9-4640-94b1-019c85704aed-kube-api-access-bp7jr\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.164988 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjw7w\" (UniqueName: \"kubernetes.io/projected/6e9cd745-1b41-41dd-96a7-7e67ef51684d-kube-api-access-kjw7w\") on node \"crc\" DevicePath \"\"" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.397588 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qxvjk" event={"ID":"9b8c7f07-5d37-4501-9c88-32cff699802f","Type":"ContainerDied","Data":"64e43675f1bffae0ab94d7ebe6f0ad2a2e46fd4cfd5a035cb49fbfa7b130e357"} Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.398116 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64e43675f1bffae0ab94d7ebe6f0ad2a2e46fd4cfd5a035cb49fbfa7b130e357" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 
10:00:47.397865 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qxvjk" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.399065 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mcgfr" event={"ID":"6e9cd745-1b41-41dd-96a7-7e67ef51684d","Type":"ContainerDied","Data":"e8934f1043855d4629afd45c506a60fbdb54d5e2c695ff2c2ade178291a960e0"} Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.399114 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8934f1043855d4629afd45c506a60fbdb54d5e2c695ff2c2ade178291a960e0" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.399210 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mcgfr" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.400550 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gmsv6" event={"ID":"7e007462-51b9-4640-94b1-019c85704aed","Type":"ContainerDied","Data":"6076fdb69a1a4a1e38da95d0031b22d1cd6d50c37ed05754447a5718b22db586"} Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.400616 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gmsv6" Sep 29 10:00:47 crc kubenswrapper[4922]: I0929 10:00:47.400641 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6076fdb69a1a4a1e38da95d0031b22d1cd6d50c37ed05754447a5718b22db586" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.333353 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-55rl7"] Sep 29 10:00:49 crc kubenswrapper[4922]: E0929 10:00:49.334203 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1420681-6f1a-40a2-8176-32fcff81af93" containerName="mariadb-account-create" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334219 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1420681-6f1a-40a2-8176-32fcff81af93" containerName="mariadb-account-create" Sep 29 10:00:49 crc kubenswrapper[4922]: E0929 10:00:49.334230 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9cd745-1b41-41dd-96a7-7e67ef51684d" containerName="mariadb-database-create" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334237 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9cd745-1b41-41dd-96a7-7e67ef51684d" containerName="mariadb-database-create" Sep 29 10:00:49 crc kubenswrapper[4922]: E0929 10:00:49.334264 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e007462-51b9-4640-94b1-019c85704aed" containerName="mariadb-database-create" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334270 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e007462-51b9-4640-94b1-019c85704aed" containerName="mariadb-database-create" Sep 29 10:00:49 crc kubenswrapper[4922]: E0929 10:00:49.334287 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8c7f07-5d37-4501-9c88-32cff699802f" containerName="mariadb-database-create" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334293 4922 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9b8c7f07-5d37-4501-9c88-32cff699802f" containerName="mariadb-database-create" Sep 29 10:00:49 crc kubenswrapper[4922]: E0929 10:00:49.334303 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b99b68d-2f67-466e-88af-d60bc5d9d283" containerName="dnsmasq-dns" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334310 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b99b68d-2f67-466e-88af-d60bc5d9d283" containerName="dnsmasq-dns" Sep 29 10:00:49 crc kubenswrapper[4922]: E0929 10:00:49.334318 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b99b68d-2f67-466e-88af-d60bc5d9d283" containerName="init" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334325 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b99b68d-2f67-466e-88af-d60bc5d9d283" containerName="init" Sep 29 10:00:49 crc kubenswrapper[4922]: E0929 10:00:49.334343 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8ca4de-80d0-4219-bf2f-4a95a887db28" containerName="ovn-config" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334348 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8ca4de-80d0-4219-bf2f-4a95a887db28" containerName="ovn-config" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334517 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b99b68d-2f67-466e-88af-d60bc5d9d283" containerName="dnsmasq-dns" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334529 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e9cd745-1b41-41dd-96a7-7e67ef51684d" containerName="mariadb-database-create" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334542 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8c7f07-5d37-4501-9c88-32cff699802f" containerName="mariadb-database-create" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334551 4922 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7e007462-51b9-4640-94b1-019c85704aed" containerName="mariadb-database-create" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334565 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8ca4de-80d0-4219-bf2f-4a95a887db28" containerName="ovn-config" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.334575 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1420681-6f1a-40a2-8176-32fcff81af93" containerName="mariadb-account-create" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.335229 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-55rl7" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.338936 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.347816 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.347920 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-94nrr" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.349774 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-55rl7"] Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.350491 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.508285 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-combined-ca-bundle\") pod \"keystone-db-sync-55rl7\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") " pod="openstack/keystone-db-sync-55rl7" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.509019 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-config-data\") pod \"keystone-db-sync-55rl7\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") " pod="openstack/keystone-db-sync-55rl7" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.509085 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kcgj\" (UniqueName: \"kubernetes.io/projected/a5832265-b311-40bd-acbb-578a7c35814f-kube-api-access-8kcgj\") pod \"keystone-db-sync-55rl7\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") " pod="openstack/keystone-db-sync-55rl7" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.610332 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-config-data\") pod \"keystone-db-sync-55rl7\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") " pod="openstack/keystone-db-sync-55rl7" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.610414 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kcgj\" (UniqueName: \"kubernetes.io/projected/a5832265-b311-40bd-acbb-578a7c35814f-kube-api-access-8kcgj\") pod \"keystone-db-sync-55rl7\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") " pod="openstack/keystone-db-sync-55rl7" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.610570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-combined-ca-bundle\") pod \"keystone-db-sync-55rl7\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") " pod="openstack/keystone-db-sync-55rl7" Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.622433 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-config-data\") pod \"keystone-db-sync-55rl7\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") " pod="openstack/keystone-db-sync-55rl7"
Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.629937 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kcgj\" (UniqueName: \"kubernetes.io/projected/a5832265-b311-40bd-acbb-578a7c35814f-kube-api-access-8kcgj\") pod \"keystone-db-sync-55rl7\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") " pod="openstack/keystone-db-sync-55rl7"
Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.635184 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-combined-ca-bundle\") pod \"keystone-db-sync-55rl7\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") " pod="openstack/keystone-db-sync-55rl7"
Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.656160 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-55rl7"
Sep 29 10:00:49 crc kubenswrapper[4922]: I0929 10:00:49.965027 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-55rl7"]
Sep 29 10:00:50 crc kubenswrapper[4922]: I0929 10:00:50.433667 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-55rl7" event={"ID":"a5832265-b311-40bd-acbb-578a7c35814f","Type":"ContainerStarted","Data":"8864d30600b06c5aa28ea818a07f12708a9581041f60093d5396bf78b6e3e62d"}
Sep 29 10:00:51 crc kubenswrapper[4922]: I0929 10:00:51.455677 4922 generic.go:334] "Generic (PLEG): container finished" podID="8fd503cc-5b91-4ee8-b354-ada3ba37812a" containerID="d7f13a60bfe17a8e6b865b9e72792711035c6a568ca0bf6357b24e2b42bf6ae5" exitCode=0
Sep 29 10:00:51 crc kubenswrapper[4922]: I0929 10:00:51.463411 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-57wm2" event={"ID":"8fd503cc-5b91-4ee8-b354-ada3ba37812a","Type":"ContainerDied","Data":"d7f13a60bfe17a8e6b865b9e72792711035c6a568ca0bf6357b24e2b42bf6ae5"}
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.470324 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-57wm2"
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.502083 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-57wm2" event={"ID":"8fd503cc-5b91-4ee8-b354-ada3ba37812a","Type":"ContainerDied","Data":"483a376404b03ec277541d32a2d83a63318515b2173c7c54aeb5a0005259b36a"}
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.502288 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="483a376404b03ec277541d32a2d83a63318515b2173c7c54aeb5a0005259b36a"
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.502485 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-57wm2"
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.619505 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-combined-ca-bundle\") pod \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") "
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.619967 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j67v7\" (UniqueName: \"kubernetes.io/projected/8fd503cc-5b91-4ee8-b354-ada3ba37812a-kube-api-access-j67v7\") pod \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") "
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.620190 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-config-data\") pod \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") "
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.620286 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-db-sync-config-data\") pod \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\" (UID: \"8fd503cc-5b91-4ee8-b354-ada3ba37812a\") "
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.624586 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8fd503cc-5b91-4ee8-b354-ada3ba37812a" (UID: "8fd503cc-5b91-4ee8-b354-ada3ba37812a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.624744 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd503cc-5b91-4ee8-b354-ada3ba37812a-kube-api-access-j67v7" (OuterVolumeSpecName: "kube-api-access-j67v7") pod "8fd503cc-5b91-4ee8-b354-ada3ba37812a" (UID: "8fd503cc-5b91-4ee8-b354-ada3ba37812a"). InnerVolumeSpecName "kube-api-access-j67v7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.642664 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fd503cc-5b91-4ee8-b354-ada3ba37812a" (UID: "8fd503cc-5b91-4ee8-b354-ada3ba37812a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.674735 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-config-data" (OuterVolumeSpecName: "config-data") pod "8fd503cc-5b91-4ee8-b354-ada3ba37812a" (UID: "8fd503cc-5b91-4ee8-b354-ada3ba37812a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.723462 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.723513 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j67v7\" (UniqueName: \"kubernetes.io/projected/8fd503cc-5b91-4ee8-b354-ada3ba37812a-kube-api-access-j67v7\") on node \"crc\" DevicePath \"\""
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.723532 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 10:00:54 crc kubenswrapper[4922]: I0929 10:00:54.723548 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fd503cc-5b91-4ee8-b354-ada3ba37812a-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.519481 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-55rl7" event={"ID":"a5832265-b311-40bd-acbb-578a7c35814f","Type":"ContainerStarted","Data":"1401802f4c7bf771ed6fc6c0336e24190184d7386012ee75c71b6f7682ff4d76"}
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.559181 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-55rl7" podStartSLOduration=2.081584886 podStartE2EDuration="6.559152866s" podCreationTimestamp="2025-09-29 10:00:49 +0000 UTC" firstStartedPulling="2025-09-29 10:00:49.974556256 +0000 UTC m=+975.340786520" lastFinishedPulling="2025-09-29 10:00:54.452124196 +0000 UTC m=+979.818354500" observedRunningTime="2025-09-29 10:00:55.5522607 +0000 UTC m=+980.918491004" watchObservedRunningTime="2025-09-29 10:00:55.559152866 +0000 UTC m=+980.925383140"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.872459 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-86k6k"]
Sep 29 10:00:55 crc kubenswrapper[4922]: E0929 10:00:55.875548 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd503cc-5b91-4ee8-b354-ada3ba37812a" containerName="glance-db-sync"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.875576 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd503cc-5b91-4ee8-b354-ada3ba37812a" containerName="glance-db-sync"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.875794 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd503cc-5b91-4ee8-b354-ada3ba37812a" containerName="glance-db-sync"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.877021 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.900655 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-86k6k"]
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.956671 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.956725 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.956745 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-config\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.956770 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.956794 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:55 crc kubenswrapper[4922]: I0929 10:00:55.956810 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwdh\" (UniqueName: \"kubernetes.io/projected/143239b5-b418-41a0-acac-1a79af28f313-kube-api-access-sgwdh\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.058630 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.058679 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.058706 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-config\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.058733 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.058755 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.058811 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwdh\" (UniqueName: \"kubernetes.io/projected/143239b5-b418-41a0-acac-1a79af28f313-kube-api-access-sgwdh\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.059878 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.059914 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.060377 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.060609 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.060686 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-config\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.090208 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwdh\" (UniqueName: \"kubernetes.io/projected/143239b5-b418-41a0-acac-1a79af28f313-kube-api-access-sgwdh\") pod \"dnsmasq-dns-7ff5475cc9-86k6k\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.195863 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.589465 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ccc8-account-create-ss7k4"]
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.591221 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ccc8-account-create-ss7k4"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.595364 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.612060 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ccc8-account-create-ss7k4"]
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.716555 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-86k6k"]
Sep 29 10:00:56 crc kubenswrapper[4922]: W0929 10:00:56.721468 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod143239b5_b418_41a0_acac_1a79af28f313.slice/crio-f8168216fd903343845d9395a532f0930ede11f44ff586770ba76d112dc902a1 WatchSource:0}: Error finding container f8168216fd903343845d9395a532f0930ede11f44ff586770ba76d112dc902a1: Status 404 returned error can't find the container with id f8168216fd903343845d9395a532f0930ede11f44ff586770ba76d112dc902a1
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.762664 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e940-account-create-6pnx5"]
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.763976 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e940-account-create-6pnx5"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.767663 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.774814 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2drlx\" (UniqueName: \"kubernetes.io/projected/6abd65ed-7467-4e06-91a3-8190c697c779-kube-api-access-2drlx\") pod \"barbican-ccc8-account-create-ss7k4\" (UID: \"6abd65ed-7467-4e06-91a3-8190c697c779\") " pod="openstack/barbican-ccc8-account-create-ss7k4"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.780942 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e940-account-create-6pnx5"]
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.872720 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-902c-account-create-d724p"]
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.876339 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2drlx\" (UniqueName: \"kubernetes.io/projected/6abd65ed-7467-4e06-91a3-8190c697c779-kube-api-access-2drlx\") pod \"barbican-ccc8-account-create-ss7k4\" (UID: \"6abd65ed-7467-4e06-91a3-8190c697c779\") " pod="openstack/barbican-ccc8-account-create-ss7k4"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.876510 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqn7w\" (UniqueName: \"kubernetes.io/projected/51967537-baef-4e1f-a056-fc90648a3193-kube-api-access-fqn7w\") pod \"cinder-e940-account-create-6pnx5\" (UID: \"51967537-baef-4e1f-a056-fc90648a3193\") " pod="openstack/cinder-e940-account-create-6pnx5"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.877559 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-902c-account-create-d724p"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.879474 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.882313 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-902c-account-create-d724p"]
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.901117 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2drlx\" (UniqueName: \"kubernetes.io/projected/6abd65ed-7467-4e06-91a3-8190c697c779-kube-api-access-2drlx\") pod \"barbican-ccc8-account-create-ss7k4\" (UID: \"6abd65ed-7467-4e06-91a3-8190c697c779\") " pod="openstack/barbican-ccc8-account-create-ss7k4"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.932774 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ccc8-account-create-ss7k4"
Sep 29 10:00:56 crc kubenswrapper[4922]: I0929 10:00:56.977821 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqn7w\" (UniqueName: \"kubernetes.io/projected/51967537-baef-4e1f-a056-fc90648a3193-kube-api-access-fqn7w\") pod \"cinder-e940-account-create-6pnx5\" (UID: \"51967537-baef-4e1f-a056-fc90648a3193\") " pod="openstack/cinder-e940-account-create-6pnx5"
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.002480 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqn7w\" (UniqueName: \"kubernetes.io/projected/51967537-baef-4e1f-a056-fc90648a3193-kube-api-access-fqn7w\") pod \"cinder-e940-account-create-6pnx5\" (UID: \"51967537-baef-4e1f-a056-fc90648a3193\") " pod="openstack/cinder-e940-account-create-6pnx5"
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.079641 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t5fb\" (UniqueName: \"kubernetes.io/projected/d471b234-b711-4771-a1c7-818c56789a93-kube-api-access-6t5fb\") pod \"neutron-902c-account-create-d724p\" (UID: \"d471b234-b711-4771-a1c7-818c56789a93\") " pod="openstack/neutron-902c-account-create-d724p"
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.183662 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t5fb\" (UniqueName: \"kubernetes.io/projected/d471b234-b711-4771-a1c7-818c56789a93-kube-api-access-6t5fb\") pod \"neutron-902c-account-create-d724p\" (UID: \"d471b234-b711-4771-a1c7-818c56789a93\") " pod="openstack/neutron-902c-account-create-d724p"
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.205525 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t5fb\" (UniqueName: \"kubernetes.io/projected/d471b234-b711-4771-a1c7-818c56789a93-kube-api-access-6t5fb\") pod \"neutron-902c-account-create-d724p\" (UID: \"d471b234-b711-4771-a1c7-818c56789a93\") " pod="openstack/neutron-902c-account-create-d724p"
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.291297 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e940-account-create-6pnx5"
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.312754 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-902c-account-create-d724p"
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.487264 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ccc8-account-create-ss7k4"]
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.553967 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ccc8-account-create-ss7k4" event={"ID":"6abd65ed-7467-4e06-91a3-8190c697c779","Type":"ContainerStarted","Data":"405dbf2a511dea5f933e8d8f4ed78287ea446d755a5d7cec47fb6aa8cd766d24"}
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.556820 4922 generic.go:334] "Generic (PLEG): container finished" podID="143239b5-b418-41a0-acac-1a79af28f313" containerID="ec6bf3aa3d1dc98ec5adfe5061b5f64352fec5e0b036b94a7396a2ca47fead68" exitCode=0
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.556903 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k" event={"ID":"143239b5-b418-41a0-acac-1a79af28f313","Type":"ContainerDied","Data":"ec6bf3aa3d1dc98ec5adfe5061b5f64352fec5e0b036b94a7396a2ca47fead68"}
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.558599 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k" event={"ID":"143239b5-b418-41a0-acac-1a79af28f313","Type":"ContainerStarted","Data":"f8168216fd903343845d9395a532f0930ede11f44ff586770ba76d112dc902a1"}
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.788972 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e940-account-create-6pnx5"]
Sep 29 10:00:57 crc kubenswrapper[4922]: W0929 10:00:57.803429 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51967537_baef_4e1f_a056_fc90648a3193.slice/crio-3b595d88bf1cc39a0d2b937dd18b7abae78747ce005616f43cedb8b9aec9963b WatchSource:0}: Error finding container 3b595d88bf1cc39a0d2b937dd18b7abae78747ce005616f43cedb8b9aec9963b: Status 404 returned error can't find the container with id 3b595d88bf1cc39a0d2b937dd18b7abae78747ce005616f43cedb8b9aec9963b
Sep 29 10:00:57 crc kubenswrapper[4922]: I0929 10:00:57.903611 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-902c-account-create-d724p"]
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.569915 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k" event={"ID":"143239b5-b418-41a0-acac-1a79af28f313","Type":"ContainerStarted","Data":"317945c6901c39e438e250b5e72cf88c9149017dbd65ed188d4e9c11b4a1a646"}
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.570410 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k"
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.573161 4922 generic.go:334] "Generic (PLEG): container finished" podID="d471b234-b711-4771-a1c7-818c56789a93" containerID="3f34acd702c80793661ad84371ea0d7601da79129b46adb57966367a6bf794da" exitCode=0
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.573252 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-902c-account-create-d724p" event={"ID":"d471b234-b711-4771-a1c7-818c56789a93","Type":"ContainerDied","Data":"3f34acd702c80793661ad84371ea0d7601da79129b46adb57966367a6bf794da"}
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.573290 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-902c-account-create-d724p" event={"ID":"d471b234-b711-4771-a1c7-818c56789a93","Type":"ContainerStarted","Data":"5033b41acf16b2d77c051545fbfc4c1a10f8227fc51459a9965355601a22ab49"}
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.581020 4922 generic.go:334] "Generic (PLEG): container finished" podID="6abd65ed-7467-4e06-91a3-8190c697c779" containerID="eb648f4baca8d0fce51976e46dba225624eb8e66bb4a457c728ece1cb525cd61" exitCode=0
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.581105 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ccc8-account-create-ss7k4" event={"ID":"6abd65ed-7467-4e06-91a3-8190c697c779","Type":"ContainerDied","Data":"eb648f4baca8d0fce51976e46dba225624eb8e66bb4a457c728ece1cb525cd61"}
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.586288 4922 generic.go:334] "Generic (PLEG): container finished" podID="51967537-baef-4e1f-a056-fc90648a3193" containerID="a53e2d45a2a7911f9babbdf279dad97f43b638caf0657eccca37a16a3458a8be" exitCode=0
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.586346 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e940-account-create-6pnx5" event={"ID":"51967537-baef-4e1f-a056-fc90648a3193","Type":"ContainerDied","Data":"a53e2d45a2a7911f9babbdf279dad97f43b638caf0657eccca37a16a3458a8be"}
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.586379 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e940-account-create-6pnx5" event={"ID":"51967537-baef-4e1f-a056-fc90648a3193","Type":"ContainerStarted","Data":"3b595d88bf1cc39a0d2b937dd18b7abae78747ce005616f43cedb8b9aec9963b"}
Sep 29 10:00:58 crc kubenswrapper[4922]: I0929 10:00:58.608400 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k" podStartSLOduration=3.608376463 podStartE2EDuration="3.608376463s" podCreationTimestamp="2025-09-29 10:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:00:58.604367024 +0000 UTC m=+983.970597288" watchObservedRunningTime="2025-09-29 10:00:58.608376463 +0000 UTC m=+983.974606737"
Sep 29 10:00:59 crc kubenswrapper[4922]: I0929 10:00:59.597196 4922 generic.go:334] "Generic (PLEG): container finished" podID="a5832265-b311-40bd-acbb-578a7c35814f" containerID="1401802f4c7bf771ed6fc6c0336e24190184d7386012ee75c71b6f7682ff4d76" exitCode=0
Sep 29 10:00:59 crc kubenswrapper[4922]: I0929 10:00:59.597408 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-55rl7" event={"ID":"a5832265-b311-40bd-acbb-578a7c35814f","Type":"ContainerDied","Data":"1401802f4c7bf771ed6fc6c0336e24190184d7386012ee75c71b6f7682ff4d76"}
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.023348 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ccc8-account-create-ss7k4"
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.128542 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e940-account-create-6pnx5"
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.131154 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-902c-account-create-d724p"
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.195163 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqn7w\" (UniqueName: \"kubernetes.io/projected/51967537-baef-4e1f-a056-fc90648a3193-kube-api-access-fqn7w\") pod \"51967537-baef-4e1f-a056-fc90648a3193\" (UID: \"51967537-baef-4e1f-a056-fc90648a3193\") "
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.195572 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2drlx\" (UniqueName: \"kubernetes.io/projected/6abd65ed-7467-4e06-91a3-8190c697c779-kube-api-access-2drlx\") pod \"6abd65ed-7467-4e06-91a3-8190c697c779\" (UID: \"6abd65ed-7467-4e06-91a3-8190c697c779\") "
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.195707 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t5fb\" (UniqueName: \"kubernetes.io/projected/d471b234-b711-4771-a1c7-818c56789a93-kube-api-access-6t5fb\") pod \"d471b234-b711-4771-a1c7-818c56789a93\" (UID: \"d471b234-b711-4771-a1c7-818c56789a93\") "
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.213026 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6abd65ed-7467-4e06-91a3-8190c697c779-kube-api-access-2drlx" (OuterVolumeSpecName: "kube-api-access-2drlx") pod "6abd65ed-7467-4e06-91a3-8190c697c779" (UID: "6abd65ed-7467-4e06-91a3-8190c697c779"). InnerVolumeSpecName "kube-api-access-2drlx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.213340 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d471b234-b711-4771-a1c7-818c56789a93-kube-api-access-6t5fb" (OuterVolumeSpecName: "kube-api-access-6t5fb") pod "d471b234-b711-4771-a1c7-818c56789a93" (UID: "d471b234-b711-4771-a1c7-818c56789a93"). InnerVolumeSpecName "kube-api-access-6t5fb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.216186 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51967537-baef-4e1f-a056-fc90648a3193-kube-api-access-fqn7w" (OuterVolumeSpecName: "kube-api-access-fqn7w") pod "51967537-baef-4e1f-a056-fc90648a3193" (UID: "51967537-baef-4e1f-a056-fc90648a3193"). InnerVolumeSpecName "kube-api-access-fqn7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.297088 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2drlx\" (UniqueName: \"kubernetes.io/projected/6abd65ed-7467-4e06-91a3-8190c697c779-kube-api-access-2drlx\") on node \"crc\" DevicePath \"\""
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.297126 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t5fb\" (UniqueName: \"kubernetes.io/projected/d471b234-b711-4771-a1c7-818c56789a93-kube-api-access-6t5fb\") on node \"crc\" DevicePath \"\""
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.297139 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqn7w\" (UniqueName: \"kubernetes.io/projected/51967537-baef-4e1f-a056-fc90648a3193-kube-api-access-fqn7w\") on node \"crc\" DevicePath \"\""
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.610757 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e940-account-create-6pnx5" event={"ID":"51967537-baef-4e1f-a056-fc90648a3193","Type":"ContainerDied","Data":"3b595d88bf1cc39a0d2b937dd18b7abae78747ce005616f43cedb8b9aec9963b"}
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.610814 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b595d88bf1cc39a0d2b937dd18b7abae78747ce005616f43cedb8b9aec9963b"
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.611064 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e940-account-create-6pnx5"
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.612409 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-902c-account-create-d724p" event={"ID":"d471b234-b711-4771-a1c7-818c56789a93","Type":"ContainerDied","Data":"5033b41acf16b2d77c051545fbfc4c1a10f8227fc51459a9965355601a22ab49"}
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.612460 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5033b41acf16b2d77c051545fbfc4c1a10f8227fc51459a9965355601a22ab49"
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.612482 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-902c-account-create-d724p"
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.613998 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ccc8-account-create-ss7k4" event={"ID":"6abd65ed-7467-4e06-91a3-8190c697c779","Type":"ContainerDied","Data":"405dbf2a511dea5f933e8d8f4ed78287ea446d755a5d7cec47fb6aa8cd766d24"}
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.614028 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ccc8-account-create-ss7k4"
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.614034 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="405dbf2a511dea5f933e8d8f4ed78287ea446d755a5d7cec47fb6aa8cd766d24"
Sep 29 10:01:00 crc kubenswrapper[4922]: I0929 10:01:00.907174 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-55rl7"
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.012105 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kcgj\" (UniqueName: \"kubernetes.io/projected/a5832265-b311-40bd-acbb-578a7c35814f-kube-api-access-8kcgj\") pod \"a5832265-b311-40bd-acbb-578a7c35814f\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") "
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.012177 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-combined-ca-bundle\") pod \"a5832265-b311-40bd-acbb-578a7c35814f\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") "
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.012226 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-config-data\") pod \"a5832265-b311-40bd-acbb-578a7c35814f\" (UID: \"a5832265-b311-40bd-acbb-578a7c35814f\") "
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.017810 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5832265-b311-40bd-acbb-578a7c35814f-kube-api-access-8kcgj" (OuterVolumeSpecName: "kube-api-access-8kcgj") pod "a5832265-b311-40bd-acbb-578a7c35814f" (UID: "a5832265-b311-40bd-acbb-578a7c35814f"). InnerVolumeSpecName "kube-api-access-8kcgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.038976 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5832265-b311-40bd-acbb-578a7c35814f" (UID: "a5832265-b311-40bd-acbb-578a7c35814f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.056362 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-config-data" (OuterVolumeSpecName: "config-data") pod "a5832265-b311-40bd-acbb-578a7c35814f" (UID: "a5832265-b311-40bd-acbb-578a7c35814f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.115969 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kcgj\" (UniqueName: \"kubernetes.io/projected/a5832265-b311-40bd-acbb-578a7c35814f-kube-api-access-8kcgj\") on node \"crc\" DevicePath \"\""
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.116644 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.116693 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5832265-b311-40bd-acbb-578a7c35814f-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.625716 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-55rl7"
event={"ID":"a5832265-b311-40bd-acbb-578a7c35814f","Type":"ContainerDied","Data":"8864d30600b06c5aa28ea818a07f12708a9581041f60093d5396bf78b6e3e62d"} Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.625769 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8864d30600b06c5aa28ea818a07f12708a9581041f60093d5396bf78b6e3e62d" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.625812 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-55rl7" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.824461 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-86k6k"] Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.824721 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k" podUID="143239b5-b418-41a0-acac-1a79af28f313" containerName="dnsmasq-dns" containerID="cri-o://317945c6901c39e438e250b5e72cf88c9149017dbd65ed188d4e9c11b4a1a646" gracePeriod=10 Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.854805 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-c7bk8"] Sep 29 10:01:01 crc kubenswrapper[4922]: E0929 10:01:01.855319 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d471b234-b711-4771-a1c7-818c56789a93" containerName="mariadb-account-create" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.855345 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d471b234-b711-4771-a1c7-818c56789a93" containerName="mariadb-account-create" Sep 29 10:01:01 crc kubenswrapper[4922]: E0929 10:01:01.855358 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51967537-baef-4e1f-a056-fc90648a3193" containerName="mariadb-account-create" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.855366 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="51967537-baef-4e1f-a056-fc90648a3193" containerName="mariadb-account-create" Sep 29 10:01:01 crc kubenswrapper[4922]: E0929 10:01:01.855403 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6abd65ed-7467-4e06-91a3-8190c697c779" containerName="mariadb-account-create" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.855412 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abd65ed-7467-4e06-91a3-8190c697c779" containerName="mariadb-account-create" Sep 29 10:01:01 crc kubenswrapper[4922]: E0929 10:01:01.855434 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5832265-b311-40bd-acbb-578a7c35814f" containerName="keystone-db-sync" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.855442 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5832265-b311-40bd-acbb-578a7c35814f" containerName="keystone-db-sync" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.855645 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d471b234-b711-4771-a1c7-818c56789a93" containerName="mariadb-account-create" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.855668 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="51967537-baef-4e1f-a056-fc90648a3193" containerName="mariadb-account-create" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.855682 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6abd65ed-7467-4e06-91a3-8190c697c779" containerName="mariadb-account-create" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.855700 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5832265-b311-40bd-acbb-578a7c35814f" containerName="keystone-db-sync" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.856493 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.858990 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-94nrr" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.859474 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.859896 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.864114 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.894305 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c7bk8"] Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.915800 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-rshpw"] Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.918149 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.935543 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kczx5\" (UniqueName: \"kubernetes.io/projected/43132039-b72b-4380-9db8-571b746d0f0b-kube-api-access-kczx5\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.935630 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-scripts\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.935677 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hq86\" (UniqueName: \"kubernetes.io/projected/7d147102-15c5-4d72-814b-5c5a52263122-kube-api-access-8hq86\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.935711 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-fernet-keys\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.935734 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: 
\"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.935768 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-credential-keys\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.935806 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.935986 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.936014 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-config\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.936049 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-svc\") pod 
\"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.936119 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-config-data\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.936177 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-combined-ca-bundle\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:01 crc kubenswrapper[4922]: I0929 10:01:01.980130 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-rshpw"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039012 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039080 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039099 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-config\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039120 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039160 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-config-data\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039194 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-combined-ca-bundle\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039219 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kczx5\" (UniqueName: \"kubernetes.io/projected/43132039-b72b-4380-9db8-571b746d0f0b-kube-api-access-kczx5\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039263 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-scripts\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039292 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hq86\" (UniqueName: \"kubernetes.io/projected/7d147102-15c5-4d72-814b-5c5a52263122-kube-api-access-8hq86\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039315 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-fernet-keys\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039345 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.039370 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-credential-keys\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.041766 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-config\") pod 
\"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.042367 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.042762 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.043333 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.051569 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-credential-keys\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.052413 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " 
pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.058968 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-fernet-keys\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.062622 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-combined-ca-bundle\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.069633 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-config-data\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.071346 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-scripts\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.083864 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d6548cb57-w2b9m"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.085345 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.095517 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kczx5\" (UniqueName: \"kubernetes.io/projected/43132039-b72b-4380-9db8-571b746d0f0b-kube-api-access-kczx5\") pod \"keystone-bootstrap-c7bk8\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.096175 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hq86\" (UniqueName: \"kubernetes.io/projected/7d147102-15c5-4d72-814b-5c5a52263122-kube-api-access-8hq86\") pod \"dnsmasq-dns-5c5cc7c5ff-rshpw\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.096320 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-r7mw6" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.100434 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.100757 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.100927 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.144895 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d6548cb57-w2b9m"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.146618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd5a1f1-eb6d-450c-95a3-ca487095f510-logs\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") 
" pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.146678 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfd5a1f1-eb6d-450c-95a3-ca487095f510-horizon-secret-key\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.146700 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-config-data\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.146806 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-scripts\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.146871 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwwnp\" (UniqueName: \"kubernetes.io/projected/bfd5a1f1-eb6d-450c-95a3-ca487095f510-kube-api-access-dwwnp\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.174491 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.185414 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-7lxww"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.186632 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.193208 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.193426 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.193571 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-sj2gm" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.232573 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7lxww"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.246007 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jv255"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248489 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwwnp\" (UniqueName: \"kubernetes.io/projected/bfd5a1f1-eb6d-450c-95a3-ca487095f510-kube-api-access-dwwnp\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248536 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trx5g\" (UniqueName: \"kubernetes.io/projected/0c2d9bba-864b-468d-923e-23cf0544daf9-kube-api-access-trx5g\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " 
pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248578 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-scripts\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248617 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd5a1f1-eb6d-450c-95a3-ca487095f510-logs\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248640 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-config-data\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248663 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfd5a1f1-eb6d-450c-95a3-ca487095f510-horizon-secret-key\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248681 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-config-data\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248707 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-db-sync-config-data\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248745 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c2d9bba-864b-468d-923e-23cf0544daf9-etc-machine-id\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248789 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-combined-ca-bundle\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.248819 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-scripts\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.249519 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-scripts\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.250115 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bfd5a1f1-eb6d-450c-95a3-ca487095f510-logs\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.250920 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.251858 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-config-data\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.253548 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.254079 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z57ns" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.254205 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.257307 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfd5a1f1-eb6d-450c-95a3-ca487095f510-horizon-secret-key\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.270987 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jv255"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.318630 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwwnp\" (UniqueName: 
\"kubernetes.io/projected/bfd5a1f1-eb6d-450c-95a3-ca487095f510-kube-api-access-dwwnp\") pod \"horizon-d6548cb57-w2b9m\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.354532 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-combined-ca-bundle\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.354628 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trx5g\" (UniqueName: \"kubernetes.io/projected/0c2d9bba-864b-468d-923e-23cf0544daf9-kube-api-access-trx5g\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.354669 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-scripts\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.354721 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-config-data\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.354766 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-db-sync-config-data\") pod \"cinder-db-sync-7lxww\" (UID: 
\"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.354803 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c2d9bba-864b-468d-923e-23cf0544daf9-etc-machine-id\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.354935 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c2d9bba-864b-468d-923e-23cf0544daf9-etc-machine-id\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.373904 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-combined-ca-bundle\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.377041 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.378692 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-scripts\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.382758 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-db-sync-config-data\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.410877 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trx5g\" (UniqueName: \"kubernetes.io/projected/0c2d9bba-864b-468d-923e-23cf0544daf9-kube-api-access-trx5g\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.411142 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-config-data\") pod \"cinder-db-sync-7lxww\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.438606 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-rshpw"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.456416 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b100f6-2a77-43b8-8942-0e50151142d0-logs\") pod \"placement-db-sync-jv255\" (UID: 
\"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.456479 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-scripts\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.456522 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-combined-ca-bundle\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.456557 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-config-data\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.456586 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sfbw\" (UniqueName: \"kubernetes.io/projected/e4b100f6-2a77-43b8-8942-0e50151142d0-kube-api-access-5sfbw\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.478756 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-dtwl5"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.491251 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.500472 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.500782 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gzpqd" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.501388 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-lrmb8"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.503584 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.509433 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.516362 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dtwl5"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.542870 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.556929 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-lrmb8"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.558856 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b100f6-2a77-43b8-8942-0e50151142d0-logs\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.558887 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-scripts\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.558925 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-combined-ca-bundle\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.558960 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-config-data\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.558990 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sfbw\" (UniqueName: \"kubernetes.io/projected/e4b100f6-2a77-43b8-8942-0e50151142d0-kube-api-access-5sfbw\") pod 
\"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.561174 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-nxhsm"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.563066 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b100f6-2a77-43b8-8942-0e50151142d0-logs\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.568076 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.591469 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-config-data\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.593187 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.593427 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ldrm5" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.598215 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.600494 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-combined-ca-bundle\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " 
pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.600571 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-scripts\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.604815 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-587c5959fc-qq5nc"] Sep 29 10:01:02 crc kubenswrapper[4922]: I0929 10:01:02.606543 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.613182 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.624507 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nxhsm"] Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.624657 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.628115 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sfbw\" (UniqueName: \"kubernetes.io/projected/e4b100f6-2a77-43b8-8942-0e50151142d0-kube-api-access-5sfbw\") pod \"placement-db-sync-jv255\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " pod="openstack/placement-db-sync-jv255" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.686767 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587c5959fc-qq5nc"] Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.690642 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.694402 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695193 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-config\") pod \"neutron-db-sync-nxhsm\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695246 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsrmw\" (UniqueName: \"kubernetes.io/projected/27ae426f-29ac-46dd-a865-41b4c4a0e722-kube-api-access-dsrmw\") pod \"neutron-db-sync-nxhsm\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695275 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf8qn\" (UniqueName: 
\"kubernetes.io/projected/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-kube-api-access-cf8qn\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695327 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/87c15220-301e-4bd3-8dca-cfe5773fd469-horizon-secret-key\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695344 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87c15220-301e-4bd3-8dca-cfe5773fd469-logs\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695416 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-config\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695439 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-scripts\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695457 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6nlz\" (UniqueName: 
\"kubernetes.io/projected/87c15220-301e-4bd3-8dca-cfe5773fd469-kube-api-access-p6nlz\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695476 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695498 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-combined-ca-bundle\") pod \"barbican-db-sync-dtwl5\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695528 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfb9s\" (UniqueName: \"kubernetes.io/projected/ed54c52a-229b-45f0-8526-19d6ca42237c-kube-api-access-hfb9s\") pod \"barbican-db-sync-dtwl5\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695549 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-combined-ca-bundle\") pod \"neutron-db-sync-nxhsm\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695575 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695596 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-config-data\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695637 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695660 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-db-sync-config-data\") pod \"barbican-db-sync-dtwl5\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.695696 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.702169 4922 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"glance-glance-dockercfg-7hj8z" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.702391 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.741348 4922 generic.go:334] "Generic (PLEG): container finished" podID="143239b5-b418-41a0-acac-1a79af28f313" containerID="317945c6901c39e438e250b5e72cf88c9149017dbd65ed188d4e9c11b4a1a646" exitCode=0 Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.741717 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k" event={"ID":"143239b5-b418-41a0-acac-1a79af28f313","Type":"ContainerDied","Data":"317945c6901c39e438e250b5e72cf88c9149017dbd65ed188d4e9c11b4a1a646"} Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.742004 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jv255" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.748935 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.767924 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803247 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-config-data\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803297 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-577w9\" (UniqueName: \"kubernetes.io/projected/007f8331-533d-439d-beec-80291b7c4a0d-kube-api-access-577w9\") pod 
\"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803295 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803336 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803357 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-logs\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803375 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-db-sync-config-data\") pod \"barbican-db-sync-dtwl5\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803397 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803413 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803457 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-config\") pod \"neutron-db-sync-nxhsm\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803483 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsrmw\" (UniqueName: \"kubernetes.io/projected/27ae426f-29ac-46dd-a865-41b4c4a0e722-kube-api-access-dsrmw\") pod \"neutron-db-sync-nxhsm\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803501 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf8qn\" (UniqueName: \"kubernetes.io/projected/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-kube-api-access-cf8qn\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803523 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87c15220-301e-4bd3-8dca-cfe5773fd469-logs\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803539 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/87c15220-301e-4bd3-8dca-cfe5773fd469-horizon-secret-key\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803582 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-config-data\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803603 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-config\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803622 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-scripts\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803641 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6nlz\" (UniqueName: \"kubernetes.io/projected/87c15220-301e-4bd3-8dca-cfe5773fd469-kube-api-access-p6nlz\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803660 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803679 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803695 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-combined-ca-bundle\") pod \"barbican-db-sync-dtwl5\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803721 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803741 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfb9s\" (UniqueName: \"kubernetes.io/projected/ed54c52a-229b-45f0-8526-19d6ca42237c-kube-api-access-hfb9s\") pod \"barbican-db-sync-dtwl5\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803762 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-combined-ca-bundle\") pod \"neutron-db-sync-nxhsm\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803782 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-scripts\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803803 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.803820 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.805015 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.805105 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-config-data\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.805639 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-config\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.806143 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-scripts\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.813272 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.813541 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.814972 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87c15220-301e-4bd3-8dca-cfe5773fd469-logs\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.815773 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " 
pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.819111 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.844222 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.851637 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/87c15220-301e-4bd3-8dca-cfe5773fd469-horizon-secret-key\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.874984 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfb9s\" (UniqueName: \"kubernetes.io/projected/ed54c52a-229b-45f0-8526-19d6ca42237c-kube-api-access-hfb9s\") pod \"barbican-db-sync-dtwl5\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.875547 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf8qn\" (UniqueName: \"kubernetes.io/projected/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-kube-api-access-cf8qn\") pod \"dnsmasq-dns-8b5c85b87-lrmb8\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 
10:01:02.876333 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.883800 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-config\") pod \"neutron-db-sync-nxhsm\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.889084 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.898463 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6nlz\" (UniqueName: \"kubernetes.io/projected/87c15220-301e-4bd3-8dca-cfe5773fd469-kube-api-access-p6nlz\") pod \"horizon-587c5959fc-qq5nc\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.898883 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-combined-ca-bundle\") pod \"barbican-db-sync-dtwl5\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.899933 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-combined-ca-bundle\") pod \"neutron-db-sync-nxhsm\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.902401 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-db-sync-config-data\") pod \"barbican-db-sync-dtwl5\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907328 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-scripts\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907396 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-logs\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907445 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-config-data\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907511 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907553 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907584 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-scripts\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907609 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907654 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-577w9\" (UniqueName: \"kubernetes.io/projected/007f8331-533d-439d-beec-80291b7c4a0d-kube-api-access-577w9\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907711 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-logs\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907765 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-internal-tls-certs\") pod \"glance-default-internal-api-0\" 
(UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907799 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907856 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907890 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907953 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42dmb\" (UniqueName: \"kubernetes.io/projected/688d8030-da9f-46b5-b4a4-5632d9a31864-kube-api-access-42dmb\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.907981 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.908020 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.909962 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsrmw\" (UniqueName: \"kubernetes.io/projected/27ae426f-29ac-46dd-a865-41b4c4a0e722-kube-api-access-dsrmw\") pod \"neutron-db-sync-nxhsm\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.911761 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.912718 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.913130 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-logs\") pod \"glance-default-external-api-0\" (UID: 
\"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.921863 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.932453 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.950634 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-577w9\" (UniqueName: \"kubernetes.io/projected/007f8331-533d-439d-beec-80291b7c4a0d-kube-api-access-577w9\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.951148 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.951230 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-config-data\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.952094 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-scripts\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: E0929 10:01:02.956797 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143239b5-b418-41a0-acac-1a79af28f313" containerName="dnsmasq-dns" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.956978 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="143239b5-b418-41a0-acac-1a79af28f313" containerName="dnsmasq-dns" Sep 29 10:01:03 crc kubenswrapper[4922]: E0929 10:01:02.956997 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143239b5-b418-41a0-acac-1a79af28f313" containerName="init" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.957005 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="143239b5-b418-41a0-acac-1a79af28f313" containerName="init" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.957248 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="143239b5-b418-41a0-acac-1a79af28f313" containerName="dnsmasq-dns" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.971585 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.975324 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.975596 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.977506 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.979726 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.990118 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:02.997377 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.008789 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-config\") pod \"143239b5-b418-41a0-acac-1a79af28f313\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.008846 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-svc\") pod \"143239b5-b418-41a0-acac-1a79af28f313\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.008942 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-nb\") pod \"143239b5-b418-41a0-acac-1a79af28f313\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.008980 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgwdh\" (UniqueName: \"kubernetes.io/projected/143239b5-b418-41a0-acac-1a79af28f313-kube-api-access-sgwdh\") pod \"143239b5-b418-41a0-acac-1a79af28f313\" (UID: 
\"143239b5-b418-41a0-acac-1a79af28f313\") " Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009159 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-sb\") pod \"143239b5-b418-41a0-acac-1a79af28f313\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009186 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-swift-storage-0\") pod \"143239b5-b418-41a0-acac-1a79af28f313\" (UID: \"143239b5-b418-41a0-acac-1a79af28f313\") " Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009417 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42dmb\" (UniqueName: \"kubernetes.io/projected/688d8030-da9f-46b5-b4a4-5632d9a31864-kube-api-access-42dmb\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009438 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-config-data\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009457 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009501 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-scripts\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009520 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-logs\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009620 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009660 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.009680 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.010640 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.021792 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-scripts\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.023520 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-logs\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.024523 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.028181 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.048500 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.060472 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143239b5-b418-41a0-acac-1a79af28f313-kube-api-access-sgwdh" (OuterVolumeSpecName: "kube-api-access-sgwdh") pod "143239b5-b418-41a0-acac-1a79af28f313" (UID: "143239b5-b418-41a0-acac-1a79af28f313"). InnerVolumeSpecName "kube-api-access-sgwdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.062163 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-config-data\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.094312 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.095415 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42dmb\" (UniqueName: \"kubernetes.io/projected/688d8030-da9f-46b5-b4a4-5632d9a31864-kube-api-access-42dmb\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.111794 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-log-httpd\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.111881 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.111943 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-run-httpd\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.111973 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: 
I0929 10:01:03.112005 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-config-data\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.112047 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-scripts\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.112207 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8wns\" (UniqueName: \"kubernetes.io/projected/b1089c4d-63a4-4d54-892c-d4c08291d4ec-kube-api-access-v8wns\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.112470 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgwdh\" (UniqueName: \"kubernetes.io/projected/143239b5-b418-41a0-acac-1a79af28f313-kube-api-access-sgwdh\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.148323 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.160105 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-config" (OuterVolumeSpecName: "config") pod "143239b5-b418-41a0-acac-1a79af28f313" (UID: "143239b5-b418-41a0-acac-1a79af28f313"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.167051 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.185423 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.216869 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.219667 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "143239b5-b418-41a0-acac-1a79af28f313" (UID: "143239b5-b418-41a0-acac-1a79af28f313"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.225653 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "143239b5-b418-41a0-acac-1a79af28f313" (UID: "143239b5-b418-41a0-acac-1a79af28f313"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.227369 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.227454 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-config-data\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.227514 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-scripts\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.227541 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8wns\" (UniqueName: \"kubernetes.io/projected/b1089c4d-63a4-4d54-892c-d4c08291d4ec-kube-api-access-v8wns\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.227573 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-log-httpd\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.227974 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.228057 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-run-httpd\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.228978 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.228994 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.229004 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.229493 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-run-httpd\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.229811 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-log-httpd\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 
10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.235475 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-scripts\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.237664 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.240152 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "143239b5-b418-41a0-acac-1a79af28f313" (UID: "143239b5-b418-41a0-acac-1a79af28f313"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.253813 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.258554 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8wns\" (UniqueName: \"kubernetes.io/projected/b1089c4d-63a4-4d54-892c-d4c08291d4ec-kube-api-access-v8wns\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.259179 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-config-data\") pod \"ceilometer-0\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") " pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.268629 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "143239b5-b418-41a0-acac-1a79af28f313" (UID: "143239b5-b418-41a0-acac-1a79af28f313"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.295804 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.332950 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.332984 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/143239b5-b418-41a0-acac-1a79af28f313-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.753460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k" event={"ID":"143239b5-b418-41a0-acac-1a79af28f313","Type":"ContainerDied","Data":"f8168216fd903343845d9395a532f0930ede11f44ff586770ba76d112dc902a1"} Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.753560 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-86k6k" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.753855 4922 scope.go:117] "RemoveContainer" containerID="317945c6901c39e438e250b5e72cf88c9149017dbd65ed188d4e9c11b4a1a646" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.782064 4922 scope.go:117] "RemoveContainer" containerID="ec6bf3aa3d1dc98ec5adfe5061b5f64352fec5e0b036b94a7396a2ca47fead68" Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.784778 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-86k6k"] Sep 29 10:01:03 crc kubenswrapper[4922]: I0929 10:01:03.793446 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-86k6k"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.177091 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c7bk8"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.208799 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dtwl5"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.219627 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-lrmb8"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.245619 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-rshpw"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.309263 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.407682 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nxhsm"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.432421 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jv255"] Sep 29 10:01:04 crc kubenswrapper[4922]: W0929 10:01:04.437463 4922 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4b100f6_2a77_43b8_8942_0e50151142d0.slice/crio-a857d363570297c12fe8b96f859391c1a004f62f463a63770da5251b95cfe82b WatchSource:0}: Error finding container a857d363570297c12fe8b96f859391c1a004f62f463a63770da5251b95cfe82b: Status 404 returned error can't find the container with id a857d363570297c12fe8b96f859391c1a004f62f463a63770da5251b95cfe82b Sep 29 10:01:04 crc kubenswrapper[4922]: W0929 10:01:04.438610 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27ae426f_29ac_46dd_a865_41b4c4a0e722.slice/crio-728e1b5acbd3dc9b2cdcbaf1481fba6c4bd64ec1e08222b9957492bd28c0ef8e WatchSource:0}: Error finding container 728e1b5acbd3dc9b2cdcbaf1481fba6c4bd64ec1e08222b9957492bd28c0ef8e: Status 404 returned error can't find the container with id 728e1b5acbd3dc9b2cdcbaf1481fba6c4bd64ec1e08222b9957492bd28c0ef8e Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.442204 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d6548cb57-w2b9m"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.451986 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7lxww"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.463958 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.484869 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587c5959fc-qq5nc"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.489020 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:01:04 crc kubenswrapper[4922]: W0929 10:01:04.493960 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfd5a1f1_eb6d_450c_95a3_ca487095f510.slice/crio-ff458952d3dcb045fe1eec0eb1140eff482e877f8361eb8202dbe39d45644ec3 WatchSource:0}: Error finding container ff458952d3dcb045fe1eec0eb1140eff482e877f8361eb8202dbe39d45644ec3: Status 404 returned error can't find the container with id ff458952d3dcb045fe1eec0eb1140eff482e877f8361eb8202dbe39d45644ec3 Sep 29 10:01:04 crc kubenswrapper[4922]: W0929 10:01:04.601806 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod007f8331_533d_439d_beec_80291b7c4a0d.slice/crio-31b3ace0c13d87d6280891caee7af8e2c40dd37cf6b6dd2af6a395dc4d5272cf WatchSource:0}: Error finding container 31b3ace0c13d87d6280891caee7af8e2c40dd37cf6b6dd2af6a395dc4d5272cf: Status 404 returned error can't find the container with id 31b3ace0c13d87d6280891caee7af8e2c40dd37cf6b6dd2af6a395dc4d5272cf Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.609841 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.643935 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d6548cb57-w2b9m"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.657874 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.710011 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7dc6cd987c-j7ptj"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.716195 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dc6cd987c-j7ptj"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.722316 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.736698 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.783508 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa91436-e778-43bb-b052-6e5e9928a705-logs\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.783611 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-config-data\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.783763 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eaa91436-e778-43bb-b052-6e5e9928a705-horizon-secret-key\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.783809 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-scripts\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.784136 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qm5x\" (UniqueName: 
\"kubernetes.io/projected/eaa91436-e778-43bb-b052-6e5e9928a705-kube-api-access-9qm5x\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.804617 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7lxww" event={"ID":"0c2d9bba-864b-468d-923e-23cf0544daf9","Type":"ContainerStarted","Data":"e2993ab9bd9464d336d99965fd23d36f29dfe04f8e7ff013f52028dff31fa57b"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.806659 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jv255" event={"ID":"e4b100f6-2a77-43b8-8942-0e50151142d0","Type":"ContainerStarted","Data":"a857d363570297c12fe8b96f859391c1a004f62f463a63770da5251b95cfe82b"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.809596 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d6548cb57-w2b9m" event={"ID":"bfd5a1f1-eb6d-450c-95a3-ca487095f510","Type":"ContainerStarted","Data":"ff458952d3dcb045fe1eec0eb1140eff482e877f8361eb8202dbe39d45644ec3"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.815968 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"007f8331-533d-439d-beec-80291b7c4a0d","Type":"ContainerStarted","Data":"31b3ace0c13d87d6280891caee7af8e2c40dd37cf6b6dd2af6a395dc4d5272cf"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.818938 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" event={"ID":"7d147102-15c5-4d72-814b-5c5a52263122","Type":"ContainerStarted","Data":"7781910e65102113253e54104bb0d62a24e276e19d50a41feae4df038e17f6fb"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.827318 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7bk8" 
event={"ID":"43132039-b72b-4380-9db8-571b746d0f0b","Type":"ContainerStarted","Data":"a983506c905121861874a861f7b04dd5f7d41454b3c4bd6730d6849a8d4cc35c"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.827411 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7bk8" event={"ID":"43132039-b72b-4380-9db8-571b746d0f0b","Type":"ContainerStarted","Data":"5946cfefeceaf6cdb55796e4f06a17f4541e97970c78d0ed5023abc7243cf960"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.831759 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1089c4d-63a4-4d54-892c-d4c08291d4ec","Type":"ContainerStarted","Data":"a4b3d9b268616bfaa41d1f154d60e82a648bd66021aacc32d99c3b23685a0b04"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.842020 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nxhsm" event={"ID":"27ae426f-29ac-46dd-a865-41b4c4a0e722","Type":"ContainerStarted","Data":"728e1b5acbd3dc9b2cdcbaf1481fba6c4bd64ec1e08222b9957492bd28c0ef8e"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.844502 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587c5959fc-qq5nc" event={"ID":"87c15220-301e-4bd3-8dca-cfe5773fd469","Type":"ContainerStarted","Data":"ee88b34000d502d2e7f773a9655ac067fa39ae5328cfc49d3319ff86f2cb0630"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.853708 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtwl5" event={"ID":"ed54c52a-229b-45f0-8526-19d6ca42237c","Type":"ContainerStarted","Data":"b2ab479284f23949e46567e4d9c2165fd21b2f696bc3f835a785432210be4fcf"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.858448 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-c7bk8" podStartSLOduration=3.85662727 podStartE2EDuration="3.85662727s" podCreationTimestamp="2025-09-29 10:01:01 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:04.854251755 +0000 UTC m=+990.220482019" watchObservedRunningTime="2025-09-29 10:01:04.85662727 +0000 UTC m=+990.222857554" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.889584 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qm5x\" (UniqueName: \"kubernetes.io/projected/eaa91436-e778-43bb-b052-6e5e9928a705-kube-api-access-9qm5x\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.890091 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa91436-e778-43bb-b052-6e5e9928a705-logs\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.890153 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-config-data\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.890237 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eaa91436-e778-43bb-b052-6e5e9928a705-horizon-secret-key\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.890395 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-scripts\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.892509 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa91436-e778-43bb-b052-6e5e9928a705-logs\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.893180 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-scripts\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.893543 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-config-data\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.908956 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eaa91436-e778-43bb-b052-6e5e9928a705-horizon-secret-key\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.922903 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qm5x\" (UniqueName: \"kubernetes.io/projected/eaa91436-e778-43bb-b052-6e5e9928a705-kube-api-access-9qm5x\") pod \"horizon-7dc6cd987c-j7ptj\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") 
" pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.941468 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"688d8030-da9f-46b5-b4a4-5632d9a31864","Type":"ContainerStarted","Data":"b1f6cc63b14fab631a4a59a4e40925f25050fde4662993f52e01c6beafdf275b"} Sep 29 10:01:04 crc kubenswrapper[4922]: I0929 10:01:04.943709 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" event={"ID":"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63","Type":"ContainerStarted","Data":"af56c2596d1c7f0d8a16d8fe5e4ca87d96084ddfa32d2e7ec8568f5a2d83515c"} Sep 29 10:01:05 crc kubenswrapper[4922]: I0929 10:01:05.083346 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:05 crc kubenswrapper[4922]: I0929 10:01:05.469445 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143239b5-b418-41a0-acac-1a79af28f313" path="/var/lib/kubelet/pods/143239b5-b418-41a0-acac-1a79af28f313/volumes" Sep 29 10:01:05 crc kubenswrapper[4922]: I0929 10:01:05.720611 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dc6cd987c-j7ptj"] Sep 29 10:01:05 crc kubenswrapper[4922]: W0929 10:01:05.747198 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa91436_e778_43bb_b052_6e5e9928a705.slice/crio-3fefc4643b235bdc75dfcbcbe01b34699dc5180e89085eeee85bb7d7935fc2de WatchSource:0}: Error finding container 3fefc4643b235bdc75dfcbcbe01b34699dc5180e89085eeee85bb7d7935fc2de: Status 404 returned error can't find the container with id 3fefc4643b235bdc75dfcbcbe01b34699dc5180e89085eeee85bb7d7935fc2de Sep 29 10:01:05 crc kubenswrapper[4922]: I0929 10:01:05.967004 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"007f8331-533d-439d-beec-80291b7c4a0d","Type":"ContainerStarted","Data":"8ce56584818ded922b8133ee5269042537847d075733a910e2ede7848a724a91"} Sep 29 10:01:05 crc kubenswrapper[4922]: I0929 10:01:05.970872 4922 generic.go:334] "Generic (PLEG): container finished" podID="7d147102-15c5-4d72-814b-5c5a52263122" containerID="e37b154363b1b0f7188073a01c309da92393685059f98f7cf5f8d40bba409d2a" exitCode=0 Sep 29 10:01:05 crc kubenswrapper[4922]: I0929 10:01:05.971079 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" event={"ID":"7d147102-15c5-4d72-814b-5c5a52263122","Type":"ContainerDied","Data":"e37b154363b1b0f7188073a01c309da92393685059f98f7cf5f8d40bba409d2a"} Sep 29 10:01:05 crc kubenswrapper[4922]: I0929 10:01:05.977266 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nxhsm" event={"ID":"27ae426f-29ac-46dd-a865-41b4c4a0e722","Type":"ContainerStarted","Data":"0bea57f17394c642e23109b448bdbb1ffe261d1c274d5716f320f14a775b8168"} Sep 29 10:01:05 crc kubenswrapper[4922]: I0929 10:01:05.994057 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"688d8030-da9f-46b5-b4a4-5632d9a31864","Type":"ContainerStarted","Data":"530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe"} Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.000488 4922 generic.go:334] "Generic (PLEG): container finished" podID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" containerID="ef4ca87457a9d1479e96a3f3df770f049dd1ba313ce9da1156241e355f90d3b0" exitCode=0 Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.000847 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" event={"ID":"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63","Type":"ContainerDied","Data":"ef4ca87457a9d1479e96a3f3df770f049dd1ba313ce9da1156241e355f90d3b0"} Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.003104 4922 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc6cd987c-j7ptj" event={"ID":"eaa91436-e778-43bb-b052-6e5e9928a705","Type":"ContainerStarted","Data":"3fefc4643b235bdc75dfcbcbe01b34699dc5180e89085eeee85bb7d7935fc2de"} Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.036267 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-nxhsm" podStartSLOduration=4.036239917 podStartE2EDuration="4.036239917s" podCreationTimestamp="2025-09-29 10:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:06.007478338 +0000 UTC m=+991.373708602" watchObservedRunningTime="2025-09-29 10:01:06.036239917 +0000 UTC m=+991.402470181" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.454248 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.538602 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hq86\" (UniqueName: \"kubernetes.io/projected/7d147102-15c5-4d72-814b-5c5a52263122-kube-api-access-8hq86\") pod \"7d147102-15c5-4d72-814b-5c5a52263122\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.538785 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-sb\") pod \"7d147102-15c5-4d72-814b-5c5a52263122\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.538851 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-config\") pod \"7d147102-15c5-4d72-814b-5c5a52263122\" (UID: 
\"7d147102-15c5-4d72-814b-5c5a52263122\") " Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.539022 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-svc\") pod \"7d147102-15c5-4d72-814b-5c5a52263122\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.539066 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-nb\") pod \"7d147102-15c5-4d72-814b-5c5a52263122\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.539086 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-swift-storage-0\") pod \"7d147102-15c5-4d72-814b-5c5a52263122\" (UID: \"7d147102-15c5-4d72-814b-5c5a52263122\") " Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.546956 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d147102-15c5-4d72-814b-5c5a52263122-kube-api-access-8hq86" (OuterVolumeSpecName: "kube-api-access-8hq86") pod "7d147102-15c5-4d72-814b-5c5a52263122" (UID: "7d147102-15c5-4d72-814b-5c5a52263122"). InnerVolumeSpecName "kube-api-access-8hq86". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.571315 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-config" (OuterVolumeSpecName: "config") pod "7d147102-15c5-4d72-814b-5c5a52263122" (UID: "7d147102-15c5-4d72-814b-5c5a52263122"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.577395 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d147102-15c5-4d72-814b-5c5a52263122" (UID: "7d147102-15c5-4d72-814b-5c5a52263122"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.578571 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7d147102-15c5-4d72-814b-5c5a52263122" (UID: "7d147102-15c5-4d72-814b-5c5a52263122"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.592301 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d147102-15c5-4d72-814b-5c5a52263122" (UID: "7d147102-15c5-4d72-814b-5c5a52263122"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.604817 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7d147102-15c5-4d72-814b-5c5a52263122" (UID: "7d147102-15c5-4d72-814b-5c5a52263122"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.641460 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hq86\" (UniqueName: \"kubernetes.io/projected/7d147102-15c5-4d72-814b-5c5a52263122-kube-api-access-8hq86\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.641492 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.641503 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.641512 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.641521 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:06 crc kubenswrapper[4922]: I0929 10:01:06.641556 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d147102-15c5-4d72-814b-5c5a52263122-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.025308 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"688d8030-da9f-46b5-b4a4-5632d9a31864","Type":"ContainerStarted","Data":"00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca"} Sep 29 10:01:07 crc 
kubenswrapper[4922]: I0929 10:01:07.025603 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerName="glance-log" containerID="cri-o://530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe" gracePeriod=30 Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.025983 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerName="glance-httpd" containerID="cri-o://00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca" gracePeriod=30 Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.038635 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" event={"ID":"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63","Type":"ContainerStarted","Data":"d0a71cd1c28edd8cecf5b041c4bac087b227f47dfd5c2566f32df38691862579"} Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.038784 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.043344 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" event={"ID":"7d147102-15c5-4d72-814b-5c5a52263122","Type":"ContainerDied","Data":"7781910e65102113253e54104bb0d62a24e276e19d50a41feae4df038e17f6fb"} Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.043394 4922 scope.go:117] "RemoveContainer" containerID="e37b154363b1b0f7188073a01c309da92393685059f98f7cf5f8d40bba409d2a" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.043534 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-rshpw" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.073229 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.073180747 podStartE2EDuration="5.073180747s" podCreationTimestamp="2025-09-29 10:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:07.052685631 +0000 UTC m=+992.418915895" watchObservedRunningTime="2025-09-29 10:01:07.073180747 +0000 UTC m=+992.439411021" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.099795 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" podStartSLOduration=5.099772568 podStartE2EDuration="5.099772568s" podCreationTimestamp="2025-09-29 10:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:07.094727781 +0000 UTC m=+992.460958045" watchObservedRunningTime="2025-09-29 10:01:07.099772568 +0000 UTC m=+992.466002822" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.137339 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-rshpw"] Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.149870 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-rshpw"] Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.491699 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d147102-15c5-4d72-814b-5c5a52263122" path="/var/lib/kubelet/pods/7d147102-15c5-4d72-814b-5c5a52263122/volumes" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.824023 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.893293 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42dmb\" (UniqueName: \"kubernetes.io/projected/688d8030-da9f-46b5-b4a4-5632d9a31864-kube-api-access-42dmb\") pod \"688d8030-da9f-46b5-b4a4-5632d9a31864\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.893655 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-scripts\") pod \"688d8030-da9f-46b5-b4a4-5632d9a31864\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.893691 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"688d8030-da9f-46b5-b4a4-5632d9a31864\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.896438 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-logs\") pod \"688d8030-da9f-46b5-b4a4-5632d9a31864\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.896478 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-internal-tls-certs\") pod \"688d8030-da9f-46b5-b4a4-5632d9a31864\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.896530 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-combined-ca-bundle\") pod \"688d8030-da9f-46b5-b4a4-5632d9a31864\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.896639 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-config-data\") pod \"688d8030-da9f-46b5-b4a4-5632d9a31864\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.896665 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-httpd-run\") pod \"688d8030-da9f-46b5-b4a4-5632d9a31864\" (UID: \"688d8030-da9f-46b5-b4a4-5632d9a31864\") " Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.897555 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "688d8030-da9f-46b5-b4a4-5632d9a31864" (UID: "688d8030-da9f-46b5-b4a4-5632d9a31864"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.897818 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-logs" (OuterVolumeSpecName: "logs") pod "688d8030-da9f-46b5-b4a4-5632d9a31864" (UID: "688d8030-da9f-46b5-b4a4-5632d9a31864"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.930324 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-scripts" (OuterVolumeSpecName: "scripts") pod "688d8030-da9f-46b5-b4a4-5632d9a31864" (UID: "688d8030-da9f-46b5-b4a4-5632d9a31864"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.931118 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "688d8030-da9f-46b5-b4a4-5632d9a31864" (UID: "688d8030-da9f-46b5-b4a4-5632d9a31864"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.931410 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688d8030-da9f-46b5-b4a4-5632d9a31864-kube-api-access-42dmb" (OuterVolumeSpecName: "kube-api-access-42dmb") pod "688d8030-da9f-46b5-b4a4-5632d9a31864" (UID: "688d8030-da9f-46b5-b4a4-5632d9a31864"). InnerVolumeSpecName "kube-api-access-42dmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.939767 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "688d8030-da9f-46b5-b4a4-5632d9a31864" (UID: "688d8030-da9f-46b5-b4a4-5632d9a31864"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.963984 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-config-data" (OuterVolumeSpecName: "config-data") pod "688d8030-da9f-46b5-b4a4-5632d9a31864" (UID: "688d8030-da9f-46b5-b4a4-5632d9a31864"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.991526 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "688d8030-da9f-46b5-b4a4-5632d9a31864" (UID: "688d8030-da9f-46b5-b4a4-5632d9a31864"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.999262 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.999688 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.999712 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:07 crc kubenswrapper[4922]: I0929 10:01:07.999725 4922 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 
10:01:07.999742 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:07.999952 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688d8030-da9f-46b5-b4a4-5632d9a31864-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:07.999964 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/688d8030-da9f-46b5-b4a4-5632d9a31864-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:07.999975 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42dmb\" (UniqueName: \"kubernetes.io/projected/688d8030-da9f-46b5-b4a4-5632d9a31864-kube-api-access-42dmb\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.024438 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.086513 4922 generic.go:334] "Generic (PLEG): container finished" podID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerID="00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca" exitCode=0 Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.086572 4922 generic.go:334] "Generic (PLEG): container finished" podID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerID="530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe" exitCode=143 Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.086633 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"688d8030-da9f-46b5-b4a4-5632d9a31864","Type":"ContainerDied","Data":"00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca"} Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.086671 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"688d8030-da9f-46b5-b4a4-5632d9a31864","Type":"ContainerDied","Data":"530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe"} Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.086687 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"688d8030-da9f-46b5-b4a4-5632d9a31864","Type":"ContainerDied","Data":"b1f6cc63b14fab631a4a59a4e40925f25050fde4662993f52e01c6beafdf275b"} Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.086706 4922 scope.go:117] "RemoveContainer" containerID="00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.086849 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.092121 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"007f8331-533d-439d-beec-80291b7c4a0d","Type":"ContainerStarted","Data":"aed9e66955a463e33799c3dd3159a6a6070116f3fa4f04104362bac086b01ca3"} Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.092364 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="007f8331-533d-439d-beec-80291b7c4a0d" containerName="glance-log" containerID="cri-o://8ce56584818ded922b8133ee5269042537847d075733a910e2ede7848a724a91" gracePeriod=30 Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.092914 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="007f8331-533d-439d-beec-80291b7c4a0d" containerName="glance-httpd" containerID="cri-o://aed9e66955a463e33799c3dd3159a6a6070116f3fa4f04104362bac086b01ca3" gracePeriod=30 Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.101321 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.126988 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.126961893 podStartE2EDuration="6.126961893s" podCreationTimestamp="2025-09-29 10:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:08.11172423 +0000 UTC m=+993.477954494" watchObservedRunningTime="2025-09-29 10:01:08.126961893 +0000 UTC m=+993.493192157" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.170908 4922 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.211368 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.226445 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:08 crc kubenswrapper[4922]: E0929 10:01:08.227274 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerName="glance-log" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.227303 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerName="glance-log" Sep 29 10:01:08 crc kubenswrapper[4922]: E0929 10:01:08.227325 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d147102-15c5-4d72-814b-5c5a52263122" containerName="init" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.227333 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d147102-15c5-4d72-814b-5c5a52263122" containerName="init" Sep 29 10:01:08 crc kubenswrapper[4922]: E0929 10:01:08.227366 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerName="glance-httpd" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.227375 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerName="glance-httpd" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.227563 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerName="glance-httpd" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.227576 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="688d8030-da9f-46b5-b4a4-5632d9a31864" containerName="glance-log" Sep 29 10:01:08 crc 
kubenswrapper[4922]: I0929 10:01:08.227596 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d147102-15c5-4d72-814b-5c5a52263122" containerName="init" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.228708 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.235098 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.235379 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.252556 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.306393 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.306478 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.306499 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.306535 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.306805 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj8pc\" (UniqueName: \"kubernetes.io/projected/9f703db3-c82c-47f4-9d61-ca23afacae5a-kube-api-access-zj8pc\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.306902 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.306942 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.306981 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.413330 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.413372 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.413471 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.413560 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj8pc\" (UniqueName: \"kubernetes.io/projected/9f703db3-c82c-47f4-9d61-ca23afacae5a-kube-api-access-zj8pc\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.413583 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.413648 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.413671 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.413732 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.414400 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.415640 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " 
pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.415939 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.421667 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.431750 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.433721 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.434971 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.436021 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj8pc\" (UniqueName: \"kubernetes.io/projected/9f703db3-c82c-47f4-9d61-ca23afacae5a-kube-api-access-zj8pc\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.462892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:08 crc kubenswrapper[4922]: I0929 10:01:08.595725 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:09 crc kubenswrapper[4922]: I0929 10:01:09.113859 4922 generic.go:334] "Generic (PLEG): container finished" podID="007f8331-533d-439d-beec-80291b7c4a0d" containerID="aed9e66955a463e33799c3dd3159a6a6070116f3fa4f04104362bac086b01ca3" exitCode=0 Sep 29 10:01:09 crc kubenswrapper[4922]: I0929 10:01:09.113894 4922 generic.go:334] "Generic (PLEG): container finished" podID="007f8331-533d-439d-beec-80291b7c4a0d" containerID="8ce56584818ded922b8133ee5269042537847d075733a910e2ede7848a724a91" exitCode=143 Sep 29 10:01:09 crc kubenswrapper[4922]: I0929 10:01:09.113921 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"007f8331-533d-439d-beec-80291b7c4a0d","Type":"ContainerDied","Data":"aed9e66955a463e33799c3dd3159a6a6070116f3fa4f04104362bac086b01ca3"} Sep 29 10:01:09 crc kubenswrapper[4922]: I0929 10:01:09.113953 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"007f8331-533d-439d-beec-80291b7c4a0d","Type":"ContainerDied","Data":"8ce56584818ded922b8133ee5269042537847d075733a910e2ede7848a724a91"} Sep 29 10:01:09 crc kubenswrapper[4922]: I0929 10:01:09.465409 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688d8030-da9f-46b5-b4a4-5632d9a31864" path="/var/lib/kubelet/pods/688d8030-da9f-46b5-b4a4-5632d9a31864/volumes" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.379114 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587c5959fc-qq5nc"] Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.424909 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fc765957b-xd4sr"] Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.426705 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.429935 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.502019 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fc765957b-xd4sr"] Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.502287 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dc6cd987c-j7ptj"] Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.540926 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58b957f588-sp2bt"] Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.542972 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.553020 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.561264 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58b957f588-sp2bt"] Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.603553 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-tls-certs\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.603661 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-config-data\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.603717 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7jx6\" (UniqueName: \"kubernetes.io/projected/c63e97c2-45d9-4b32-9b0e-1449fad249e6-kube-api-access-w7jx6\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.603737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-combined-ca-bundle\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" 
Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.603768 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63e97c2-45d9-4b32-9b0e-1449fad249e6-logs\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.603800 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-secret-key\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.603822 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-scripts\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706018 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f21d67-d595-4458-871c-e4bbc362b134-combined-ca-bundle\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706094 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-secret-key\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: 
I0929 10:01:11.706128 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-scripts\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706157 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/84f21d67-d595-4458-871c-e4bbc362b134-horizon-tls-certs\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706253 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84f21d67-d595-4458-871c-e4bbc362b134-scripts\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706282 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-tls-certs\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706340 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-config-data\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706371 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb6vc\" (UniqueName: \"kubernetes.io/projected/84f21d67-d595-4458-871c-e4bbc362b134-kube-api-access-wb6vc\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706429 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84f21d67-d595-4458-871c-e4bbc362b134-config-data\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706460 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84f21d67-d595-4458-871c-e4bbc362b134-horizon-secret-key\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706480 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7jx6\" (UniqueName: \"kubernetes.io/projected/c63e97c2-45d9-4b32-9b0e-1449fad249e6-kube-api-access-w7jx6\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706501 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-combined-ca-bundle\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706520 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84f21d67-d595-4458-871c-e4bbc362b134-logs\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.706550 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63e97c2-45d9-4b32-9b0e-1449fad249e6-logs\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.707181 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63e97c2-45d9-4b32-9b0e-1449fad249e6-logs\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.708473 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-config-data\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.708760 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-scripts\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.714578 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-tls-certs\") pod 
\"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.725226 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7jx6\" (UniqueName: \"kubernetes.io/projected/c63e97c2-45d9-4b32-9b0e-1449fad249e6-kube-api-access-w7jx6\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.726201 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-combined-ca-bundle\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.738615 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-secret-key\") pod \"horizon-fc765957b-xd4sr\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.759233 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.812132 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/84f21d67-d595-4458-871c-e4bbc362b134-horizon-tls-certs\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.814103 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84f21d67-d595-4458-871c-e4bbc362b134-scripts\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.814470 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84f21d67-d595-4458-871c-e4bbc362b134-scripts\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.815295 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb6vc\" (UniqueName: \"kubernetes.io/projected/84f21d67-d595-4458-871c-e4bbc362b134-kube-api-access-wb6vc\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.817002 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84f21d67-d595-4458-871c-e4bbc362b134-config-data\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 
10:01:11.817175 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84f21d67-d595-4458-871c-e4bbc362b134-horizon-secret-key\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.817200 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84f21d67-d595-4458-871c-e4bbc362b134-logs\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.817674 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84f21d67-d595-4458-871c-e4bbc362b134-config-data\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.818187 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84f21d67-d595-4458-871c-e4bbc362b134-logs\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.818200 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f21d67-d595-4458-871c-e4bbc362b134-combined-ca-bundle\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.828597 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/84f21d67-d595-4458-871c-e4bbc362b134-horizon-tls-certs\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.832026 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84f21d67-d595-4458-871c-e4bbc362b134-horizon-secret-key\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.834606 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f21d67-d595-4458-871c-e4bbc362b134-combined-ca-bundle\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.836987 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb6vc\" (UniqueName: \"kubernetes.io/projected/84f21d67-d595-4458-871c-e4bbc362b134-kube-api-access-wb6vc\") pod \"horizon-58b957f588-sp2bt\" (UID: \"84f21d67-d595-4458-871c-e4bbc362b134\") " pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:11 crc kubenswrapper[4922]: I0929 10:01:11.874807 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:12 crc kubenswrapper[4922]: I0929 10:01:12.993099 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:13 crc kubenswrapper[4922]: I0929 10:01:13.089973 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-4gkdr"] Sep 29 10:01:13 crc kubenswrapper[4922]: I0929 10:01:13.090308 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="dnsmasq-dns" containerID="cri-o://f0a3a20c7d636fab40180967310e05f56eb26bfce339fdf3a99619e6715b2558" gracePeriod=10 Sep 29 10:01:14 crc kubenswrapper[4922]: I0929 10:01:14.725498 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Sep 29 10:01:16 crc kubenswrapper[4922]: I0929 10:01:16.197194 4922 generic.go:334] "Generic (PLEG): container finished" podID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerID="f0a3a20c7d636fab40180967310e05f56eb26bfce339fdf3a99619e6715b2558" exitCode=0 Sep 29 10:01:16 crc kubenswrapper[4922]: I0929 10:01:16.197825 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" event={"ID":"90d0dcdf-6643-403b-875f-6d1f3fd797c9","Type":"ContainerDied","Data":"f0a3a20c7d636fab40180967310e05f56eb26bfce339fdf3a99619e6715b2558"} Sep 29 10:01:17 crc kubenswrapper[4922]: I0929 10:01:17.216657 4922 generic.go:334] "Generic (PLEG): container finished" podID="43132039-b72b-4380-9db8-571b746d0f0b" containerID="a983506c905121861874a861f7b04dd5f7d41454b3c4bd6730d6849a8d4cc35c" exitCode=0 Sep 29 10:01:17 crc kubenswrapper[4922]: I0929 10:01:17.216729 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7bk8" event={"ID":"43132039-b72b-4380-9db8-571b746d0f0b","Type":"ContainerDied","Data":"a983506c905121861874a861f7b04dd5f7d41454b3c4bd6730d6849a8d4cc35c"} Sep 29 10:01:19 crc kubenswrapper[4922]: I0929 10:01:19.725772 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Sep 29 10:01:21 crc kubenswrapper[4922]: E0929 10:01:21.987733 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 29 10:01:21 crc kubenswrapper[4922]: E0929 10:01:21.988523 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fch586h8dh8dh579h678hfh5b4h5bdh647h59hbh596h5d7h54dh5bch654h55h8bh5b5h68chc4h675h556hbfh5b8h5ch5c6h549h65ch66h55bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p6nlz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-587c5959fc-qq5nc_openstack(87c15220-301e-4bd3-8dca-cfe5773fd469): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:01:21 crc kubenswrapper[4922]: E0929 
10:01:21.991319 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-587c5959fc-qq5nc" podUID="87c15220-301e-4bd3-8dca-cfe5773fd469" Sep 29 10:01:22 crc kubenswrapper[4922]: E0929 10:01:22.021409 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 29 10:01:22 crc kubenswrapper[4922]: E0929 10:01:22.021677 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cch6ch549h646h667hf6h65dhb5hf5h4h566h688h655h5d7h56fhb6h54fh5d5h5bbh5b9h55fh58dh7h648hfch58dh64ch6bhcchb8h668h5b7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dwwnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-d6548cb57-w2b9m_openstack(bfd5a1f1-eb6d-450c-95a3-ca487095f510): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:01:22 crc kubenswrapper[4922]: E0929 
10:01:22.024129 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-d6548cb57-w2b9m" podUID="bfd5a1f1-eb6d-450c-95a3-ca487095f510" Sep 29 10:01:23 crc kubenswrapper[4922]: E0929 10:01:23.723348 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Sep 29 10:01:23 crc kubenswrapper[4922]: E0929 10:01:23.723557 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5sfbw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-jv255_openstack(e4b100f6-2a77-43b8-8942-0e50151142d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:01:23 crc kubenswrapper[4922]: E0929 10:01:23.724777 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-jv255" podUID="e4b100f6-2a77-43b8-8942-0e50151142d0" Sep 29 10:01:23 crc kubenswrapper[4922]: E0929 10:01:23.744402 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 29 10:01:23 crc kubenswrapper[4922]: E0929 10:01:23.744590 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55fh566hbfh7dh5dbh687hf7h65dh68h5c6h5c6h575h97h9dh58ch4h68bh57bhcbh576h647h58dh687hb4h648h5b6h58h649hd6hbh555hdcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qm5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7dc6cd987c-j7ptj_openstack(eaa91436-e778-43bb-b052-6e5e9928a705): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:01:23 crc kubenswrapper[4922]: E0929 
10:01:23.747127 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7dc6cd987c-j7ptj" podUID="eaa91436-e778-43bb-b052-6e5e9928a705" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.833925 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.843816 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.896164 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-scripts\") pod \"87c15220-301e-4bd3-8dca-cfe5773fd469\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.896985 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-config-data\") pod \"87c15220-301e-4bd3-8dca-cfe5773fd469\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.897065 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwwnp\" (UniqueName: \"kubernetes.io/projected/bfd5a1f1-eb6d-450c-95a3-ca487095f510-kube-api-access-dwwnp\") pod \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.898582 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-scripts\") pod \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.898654 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6nlz\" (UniqueName: \"kubernetes.io/projected/87c15220-301e-4bd3-8dca-cfe5773fd469-kube-api-access-p6nlz\") pod \"87c15220-301e-4bd3-8dca-cfe5773fd469\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.898725 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfd5a1f1-eb6d-450c-95a3-ca487095f510-horizon-secret-key\") pod \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.898757 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87c15220-301e-4bd3-8dca-cfe5773fd469-logs\") pod \"87c15220-301e-4bd3-8dca-cfe5773fd469\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.898791 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd5a1f1-eb6d-450c-95a3-ca487095f510-logs\") pod \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.898827 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/87c15220-301e-4bd3-8dca-cfe5773fd469-horizon-secret-key\") pod \"87c15220-301e-4bd3-8dca-cfe5773fd469\" (UID: \"87c15220-301e-4bd3-8dca-cfe5773fd469\") " 
Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.898954 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-config-data\") pod \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\" (UID: \"bfd5a1f1-eb6d-450c-95a3-ca487095f510\") " Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.897070 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-scripts" (OuterVolumeSpecName: "scripts") pod "87c15220-301e-4bd3-8dca-cfe5773fd469" (UID: "87c15220-301e-4bd3-8dca-cfe5773fd469"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.897815 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-config-data" (OuterVolumeSpecName: "config-data") pod "87c15220-301e-4bd3-8dca-cfe5773fd469" (UID: "87c15220-301e-4bd3-8dca-cfe5773fd469"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.900170 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfd5a1f1-eb6d-450c-95a3-ca487095f510-logs" (OuterVolumeSpecName: "logs") pod "bfd5a1f1-eb6d-450c-95a3-ca487095f510" (UID: "bfd5a1f1-eb6d-450c-95a3-ca487095f510"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.900603 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c15220-301e-4bd3-8dca-cfe5773fd469-logs" (OuterVolumeSpecName: "logs") pod "87c15220-301e-4bd3-8dca-cfe5773fd469" (UID: "87c15220-301e-4bd3-8dca-cfe5773fd469"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.900720 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-scripts" (OuterVolumeSpecName: "scripts") pod "bfd5a1f1-eb6d-450c-95a3-ca487095f510" (UID: "bfd5a1f1-eb6d-450c-95a3-ca487095f510"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.901099 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-config-data" (OuterVolumeSpecName: "config-data") pod "bfd5a1f1-eb6d-450c-95a3-ca487095f510" (UID: "bfd5a1f1-eb6d-450c-95a3-ca487095f510"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.905614 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd5a1f1-eb6d-450c-95a3-ca487095f510-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bfd5a1f1-eb6d-450c-95a3-ca487095f510" (UID: "bfd5a1f1-eb6d-450c-95a3-ca487095f510"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.906262 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c15220-301e-4bd3-8dca-cfe5773fd469-kube-api-access-p6nlz" (OuterVolumeSpecName: "kube-api-access-p6nlz") pod "87c15220-301e-4bd3-8dca-cfe5773fd469" (UID: "87c15220-301e-4bd3-8dca-cfe5773fd469"). InnerVolumeSpecName "kube-api-access-p6nlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.906593 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c15220-301e-4bd3-8dca-cfe5773fd469-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "87c15220-301e-4bd3-8dca-cfe5773fd469" (UID: "87c15220-301e-4bd3-8dca-cfe5773fd469"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:23 crc kubenswrapper[4922]: I0929 10:01:23.909243 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd5a1f1-eb6d-450c-95a3-ca487095f510-kube-api-access-dwwnp" (OuterVolumeSpecName: "kube-api-access-dwwnp") pod "bfd5a1f1-eb6d-450c-95a3-ca487095f510" (UID: "bfd5a1f1-eb6d-450c-95a3-ca487095f510"). InnerVolumeSpecName "kube-api-access-dwwnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004666 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004732 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004747 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwwnp\" (UniqueName: \"kubernetes.io/projected/bfd5a1f1-eb6d-450c-95a3-ca487095f510-kube-api-access-dwwnp\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004758 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6nlz\" (UniqueName: 
\"kubernetes.io/projected/87c15220-301e-4bd3-8dca-cfe5773fd469-kube-api-access-p6nlz\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004767 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87c15220-301e-4bd3-8dca-cfe5773fd469-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004775 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfd5a1f1-eb6d-450c-95a3-ca487095f510-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004784 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd5a1f1-eb6d-450c-95a3-ca487095f510-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004793 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/87c15220-301e-4bd3-8dca-cfe5773fd469-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004802 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfd5a1f1-eb6d-450c-95a3-ca487095f510-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.004811 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87c15220-301e-4bd3-8dca-cfe5773fd469-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.319117 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-587c5959fc-qq5nc" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.319131 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587c5959fc-qq5nc" event={"ID":"87c15220-301e-4bd3-8dca-cfe5773fd469","Type":"ContainerDied","Data":"ee88b34000d502d2e7f773a9655ac067fa39ae5328cfc49d3319ff86f2cb0630"} Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.324458 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d6548cb57-w2b9m" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.325123 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d6548cb57-w2b9m" event={"ID":"bfd5a1f1-eb6d-450c-95a3-ca487095f510","Type":"ContainerDied","Data":"ff458952d3dcb045fe1eec0eb1140eff482e877f8361eb8202dbe39d45644ec3"} Sep 29 10:01:24 crc kubenswrapper[4922]: E0929 10:01:24.328500 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-jv255" podUID="e4b100f6-2a77-43b8-8942-0e50151142d0" Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.443632 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d6548cb57-w2b9m"] Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.457309 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-d6548cb57-w2b9m"] Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.471816 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587c5959fc-qq5nc"] Sep 29 10:01:24 crc kubenswrapper[4922]: I0929 10:01:24.481641 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-587c5959fc-qq5nc"] Sep 29 10:01:25 crc kubenswrapper[4922]: I0929 10:01:25.464906 4922 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="87c15220-301e-4bd3-8dca-cfe5773fd469" path="/var/lib/kubelet/pods/87c15220-301e-4bd3-8dca-cfe5773fd469/volumes" Sep 29 10:01:25 crc kubenswrapper[4922]: I0929 10:01:25.465573 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd5a1f1-eb6d-450c-95a3-ca487095f510" path="/var/lib/kubelet/pods/bfd5a1f1-eb6d-450c-95a3-ca487095f510/volumes" Sep 29 10:01:26 crc kubenswrapper[4922]: E0929 10:01:26.176450 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Sep 29 10:01:26 crc kubenswrapper[4922]: E0929 10:01:26.176759 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58dh68dh9bh68ch7ch76h54bh65fhbdh675hdbh549h57dhd8h599h67bh686h76h578h7fh5f4h675h7h58ch68dh67h576h56dh5dh68ch54dhf5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v8wns,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b1089c4d-63a4-4d54-892c-d4c08291d4ec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:01:26 crc kubenswrapper[4922]: E0929 10:01:26.746277 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Sep 29 10:01:26 crc kubenswrapper[4922]: E0929 10:01:26.747025 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hfb9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-dtwl5_openstack(ed54c52a-229b-45f0-8526-19d6ca42237c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:01:26 crc kubenswrapper[4922]: E0929 10:01:26.748391 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-dtwl5" 
podUID="ed54c52a-229b-45f0-8526-19d6ca42237c" Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.834200 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.975396 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kczx5\" (UniqueName: \"kubernetes.io/projected/43132039-b72b-4380-9db8-571b746d0f0b-kube-api-access-kczx5\") pod \"43132039-b72b-4380-9db8-571b746d0f0b\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.975484 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-config-data\") pod \"43132039-b72b-4380-9db8-571b746d0f0b\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.975574 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-combined-ca-bundle\") pod \"43132039-b72b-4380-9db8-571b746d0f0b\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.975620 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-fernet-keys\") pod \"43132039-b72b-4380-9db8-571b746d0f0b\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.975793 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-credential-keys\") pod \"43132039-b72b-4380-9db8-571b746d0f0b\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") 
" Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.975986 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-scripts\") pod \"43132039-b72b-4380-9db8-571b746d0f0b\" (UID: \"43132039-b72b-4380-9db8-571b746d0f0b\") " Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.985709 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "43132039-b72b-4380-9db8-571b746d0f0b" (UID: "43132039-b72b-4380-9db8-571b746d0f0b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.985794 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "43132039-b72b-4380-9db8-571b746d0f0b" (UID: "43132039-b72b-4380-9db8-571b746d0f0b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.985809 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43132039-b72b-4380-9db8-571b746d0f0b-kube-api-access-kczx5" (OuterVolumeSpecName: "kube-api-access-kczx5") pod "43132039-b72b-4380-9db8-571b746d0f0b" (UID: "43132039-b72b-4380-9db8-571b746d0f0b"). InnerVolumeSpecName "kube-api-access-kczx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:26 crc kubenswrapper[4922]: I0929 10:01:26.999546 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-scripts" (OuterVolumeSpecName: "scripts") pod "43132039-b72b-4380-9db8-571b746d0f0b" (UID: "43132039-b72b-4380-9db8-571b746d0f0b"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.013165 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43132039-b72b-4380-9db8-571b746d0f0b" (UID: "43132039-b72b-4380-9db8-571b746d0f0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.030136 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-config-data" (OuterVolumeSpecName: "config-data") pod "43132039-b72b-4380-9db8-571b746d0f0b" (UID: "43132039-b72b-4380-9db8-571b746d0f0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.078537 4922 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.078592 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.078606 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kczx5\" (UniqueName: \"kubernetes.io/projected/43132039-b72b-4380-9db8-571b746d0f0b-kube-api-access-kczx5\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.078620 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-config-data\") on node 
\"crc\" DevicePath \"\"" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.078635 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.078646 4922 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43132039-b72b-4380-9db8-571b746d0f0b-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.356865 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c7bk8" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.359234 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7bk8" event={"ID":"43132039-b72b-4380-9db8-571b746d0f0b","Type":"ContainerDied","Data":"5946cfefeceaf6cdb55796e4f06a17f4541e97970c78d0ed5023abc7243cf960"} Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.359317 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5946cfefeceaf6cdb55796e4f06a17f4541e97970c78d0ed5023abc7243cf960" Sep 29 10:01:27 crc kubenswrapper[4922]: E0929 10:01:27.361053 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-dtwl5" podUID="ed54c52a-229b-45f0-8526-19d6ca42237c" Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.946052 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-c7bk8"] Sep 29 10:01:27 crc kubenswrapper[4922]: I0929 10:01:27.954866 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-c7bk8"] Sep 29 
10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.040186 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x7rtg"] Sep 29 10:01:28 crc kubenswrapper[4922]: E0929 10:01:28.041287 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43132039-b72b-4380-9db8-571b746d0f0b" containerName="keystone-bootstrap" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.044004 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="43132039-b72b-4380-9db8-571b746d0f0b" containerName="keystone-bootstrap" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.044800 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="43132039-b72b-4380-9db8-571b746d0f0b" containerName="keystone-bootstrap" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.045686 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.048003 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x7rtg"] Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.048658 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-94nrr" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.048997 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.049177 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.049564 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.204098 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-config-data\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.204197 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-combined-ca-bundle\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.204240 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-fernet-keys\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.204340 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-credential-keys\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.204404 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn4v8\" (UniqueName: \"kubernetes.io/projected/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-kube-api-access-wn4v8\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.204425 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-scripts\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.306788 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-config-data\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.306925 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-combined-ca-bundle\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.307001 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-fernet-keys\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.307065 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-credential-keys\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.307151 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn4v8\" (UniqueName: \"kubernetes.io/projected/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-kube-api-access-wn4v8\") pod 
\"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.307176 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-scripts\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.314066 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-credential-keys\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.314169 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-scripts\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.314525 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-combined-ca-bundle\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.314892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-fernet-keys\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: 
I0929 10:01:28.315651 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-config-data\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.329529 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn4v8\" (UniqueName: \"kubernetes.io/projected/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-kube-api-access-wn4v8\") pod \"keystone-bootstrap-x7rtg\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:28 crc kubenswrapper[4922]: I0929 10:01:28.368120 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:29 crc kubenswrapper[4922]: I0929 10:01:29.465366 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43132039-b72b-4380-9db8-571b746d0f0b" path="/var/lib/kubelet/pods/43132039-b72b-4380-9db8-571b746d0f0b/volumes" Sep 29 10:01:29 crc kubenswrapper[4922]: I0929 10:01:29.726371 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Sep 29 10:01:29 crc kubenswrapper[4922]: I0929 10:01:29.726985 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:01:33 crc kubenswrapper[4922]: I0929 10:01:33.151969 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:01:33 crc kubenswrapper[4922]: I0929 10:01:33.152526 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:01:34 
crc kubenswrapper[4922]: I0929 10:01:34.727083 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Sep 29 10:01:34 crc kubenswrapper[4922]: I0929 10:01:34.946488 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:01:34 crc kubenswrapper[4922]: I0929 10:01:34.953932 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:34 crc kubenswrapper[4922]: I0929 10:01:34.968660 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.064857 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-scripts\") pod \"007f8331-533d-439d-beec-80291b7c4a0d\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.064946 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qm5x\" (UniqueName: \"kubernetes.io/projected/eaa91436-e778-43bb-b052-6e5e9928a705-kube-api-access-9qm5x\") pod \"eaa91436-e778-43bb-b052-6e5e9928a705\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.064986 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-nb\") pod \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065042 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-httpd-run\") pod \"007f8331-533d-439d-beec-80291b7c4a0d\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065102 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-config-data\") pod \"eaa91436-e778-43bb-b052-6e5e9928a705\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065128 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-logs\") pod \"007f8331-533d-439d-beec-80291b7c4a0d\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065162 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzc8r\" (UniqueName: \"kubernetes.io/projected/90d0dcdf-6643-403b-875f-6d1f3fd797c9-kube-api-access-xzc8r\") pod \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065182 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eaa91436-e778-43bb-b052-6e5e9928a705-horizon-secret-key\") pod \"eaa91436-e778-43bb-b052-6e5e9928a705\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065200 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-scripts\") pod \"eaa91436-e778-43bb-b052-6e5e9928a705\" (UID: 
\"eaa91436-e778-43bb-b052-6e5e9928a705\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065376 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-svc\") pod \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065478 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-config\") pod \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065553 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-swift-storage-0\") pod \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065585 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"007f8331-533d-439d-beec-80291b7c4a0d\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065611 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa91436-e778-43bb-b052-6e5e9928a705-logs\") pod \"eaa91436-e778-43bb-b052-6e5e9928a705\" (UID: \"eaa91436-e778-43bb-b052-6e5e9928a705\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065664 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-sb\") pod 
\"90d0dcdf-6643-403b-875f-6d1f3fd797c9\" (UID: \"90d0dcdf-6643-403b-875f-6d1f3fd797c9\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065687 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-public-tls-certs\") pod \"007f8331-533d-439d-beec-80291b7c4a0d\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065731 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-config-data\") pod \"007f8331-533d-439d-beec-80291b7c4a0d\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065751 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-577w9\" (UniqueName: \"kubernetes.io/projected/007f8331-533d-439d-beec-80291b7c4a0d-kube-api-access-577w9\") pod \"007f8331-533d-439d-beec-80291b7c4a0d\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.065796 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-combined-ca-bundle\") pod \"007f8331-533d-439d-beec-80291b7c4a0d\" (UID: \"007f8331-533d-439d-beec-80291b7c4a0d\") " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.066628 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaa91436-e778-43bb-b052-6e5e9928a705-logs" (OuterVolumeSpecName: "logs") pod "eaa91436-e778-43bb-b052-6e5e9928a705" (UID: "eaa91436-e778-43bb-b052-6e5e9928a705"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.067023 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa91436-e778-43bb-b052-6e5e9928a705-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.069356 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "007f8331-533d-439d-beec-80291b7c4a0d" (UID: "007f8331-533d-439d-beec-80291b7c4a0d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.069955 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-logs" (OuterVolumeSpecName: "logs") pod "007f8331-533d-439d-beec-80291b7c4a0d" (UID: "007f8331-533d-439d-beec-80291b7c4a0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.070139 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-config-data" (OuterVolumeSpecName: "config-data") pod "eaa91436-e778-43bb-b052-6e5e9928a705" (UID: "eaa91436-e778-43bb-b052-6e5e9928a705"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.075609 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-scripts" (OuterVolumeSpecName: "scripts") pod "eaa91436-e778-43bb-b052-6e5e9928a705" (UID: "eaa91436-e778-43bb-b052-6e5e9928a705"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.076027 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-scripts" (OuterVolumeSpecName: "scripts") pod "007f8331-533d-439d-beec-80291b7c4a0d" (UID: "007f8331-533d-439d-beec-80291b7c4a0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.076165 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa91436-e778-43bb-b052-6e5e9928a705-kube-api-access-9qm5x" (OuterVolumeSpecName: "kube-api-access-9qm5x") pod "eaa91436-e778-43bb-b052-6e5e9928a705" (UID: "eaa91436-e778-43bb-b052-6e5e9928a705"). InnerVolumeSpecName "kube-api-access-9qm5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.083009 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007f8331-533d-439d-beec-80291b7c4a0d-kube-api-access-577w9" (OuterVolumeSpecName: "kube-api-access-577w9") pod "007f8331-533d-439d-beec-80291b7c4a0d" (UID: "007f8331-533d-439d-beec-80291b7c4a0d"). InnerVolumeSpecName "kube-api-access-577w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.083115 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d0dcdf-6643-403b-875f-6d1f3fd797c9-kube-api-access-xzc8r" (OuterVolumeSpecName: "kube-api-access-xzc8r") pod "90d0dcdf-6643-403b-875f-6d1f3fd797c9" (UID: "90d0dcdf-6643-403b-875f-6d1f3fd797c9"). InnerVolumeSpecName "kube-api-access-xzc8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.083230 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "007f8331-533d-439d-beec-80291b7c4a0d" (UID: "007f8331-533d-439d-beec-80291b7c4a0d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.094526 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa91436-e778-43bb-b052-6e5e9928a705-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "eaa91436-e778-43bb-b052-6e5e9928a705" (UID: "eaa91436-e778-43bb-b052-6e5e9928a705"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.128430 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "007f8331-533d-439d-beec-80291b7c4a0d" (UID: "007f8331-533d-439d-beec-80291b7c4a0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.142457 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-config-data" (OuterVolumeSpecName: "config-data") pod "007f8331-533d-439d-beec-80291b7c4a0d" (UID: "007f8331-533d-439d-beec-80291b7c4a0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.155974 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "007f8331-533d-439d-beec-80291b7c4a0d" (UID: "007f8331-533d-439d-beec-80291b7c4a0d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.162817 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-config" (OuterVolumeSpecName: "config") pod "90d0dcdf-6643-403b-875f-6d1f3fd797c9" (UID: "90d0dcdf-6643-403b-875f-6d1f3fd797c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170112 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170190 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170208 4922 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170224 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 
10:01:35.170237 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-577w9\" (UniqueName: \"kubernetes.io/projected/007f8331-533d-439d-beec-80291b7c4a0d-kube-api-access-577w9\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170249 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170261 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007f8331-533d-439d-beec-80291b7c4a0d-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170272 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qm5x\" (UniqueName: \"kubernetes.io/projected/eaa91436-e778-43bb-b052-6e5e9928a705-kube-api-access-9qm5x\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170283 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170296 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170308 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007f8331-533d-439d-beec-80291b7c4a0d-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170323 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzc8r\" (UniqueName: 
\"kubernetes.io/projected/90d0dcdf-6643-403b-875f-6d1f3fd797c9-kube-api-access-xzc8r\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170337 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eaa91436-e778-43bb-b052-6e5e9928a705-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.170348 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eaa91436-e778-43bb-b052-6e5e9928a705-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.172853 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90d0dcdf-6643-403b-875f-6d1f3fd797c9" (UID: "90d0dcdf-6643-403b-875f-6d1f3fd797c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.179549 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90d0dcdf-6643-403b-875f-6d1f3fd797c9" (UID: "90d0dcdf-6643-403b-875f-6d1f3fd797c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.186667 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90d0dcdf-6643-403b-875f-6d1f3fd797c9" (UID: "90d0dcdf-6643-403b-875f-6d1f3fd797c9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.188220 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90d0dcdf-6643-403b-875f-6d1f3fd797c9" (UID: "90d0dcdf-6643-403b-875f-6d1f3fd797c9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.206591 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.273217 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.273276 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.273292 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.273304 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.273317 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/90d0dcdf-6643-403b-875f-6d1f3fd797c9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.444197 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"007f8331-533d-439d-beec-80291b7c4a0d","Type":"ContainerDied","Data":"31b3ace0c13d87d6280891caee7af8e2c40dd37cf6b6dd2af6a395dc4d5272cf"} Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.444390 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.458694 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dc6cd987c-j7ptj" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.464976 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.473043 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc6cd987c-j7ptj" event={"ID":"eaa91436-e778-43bb-b052-6e5e9928a705","Type":"ContainerDied","Data":"3fefc4643b235bdc75dfcbcbe01b34699dc5180e89085eeee85bb7d7935fc2de"} Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.473097 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" event={"ID":"90d0dcdf-6643-403b-875f-6d1f3fd797c9","Type":"ContainerDied","Data":"6d9f1bab775f14a7a0f1d24d2da7fe580f29177cfca2a70ccee376e73393bc9a"} Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.502611 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.513484 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 
10:01:35.545736 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:01:35 crc kubenswrapper[4922]: E0929 10:01:35.546481 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007f8331-533d-439d-beec-80291b7c4a0d" containerName="glance-httpd" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.546507 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="007f8331-533d-439d-beec-80291b7c4a0d" containerName="glance-httpd" Sep 29 10:01:35 crc kubenswrapper[4922]: E0929 10:01:35.546532 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="init" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.546542 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="init" Sep 29 10:01:35 crc kubenswrapper[4922]: E0929 10:01:35.546571 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007f8331-533d-439d-beec-80291b7c4a0d" containerName="glance-log" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.546579 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="007f8331-533d-439d-beec-80291b7c4a0d" containerName="glance-log" Sep 29 10:01:35 crc kubenswrapper[4922]: E0929 10:01:35.546597 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="dnsmasq-dns" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.546605 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="dnsmasq-dns" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.546893 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="dnsmasq-dns" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.546939 4922 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="007f8331-533d-439d-beec-80291b7c4a0d" containerName="glance-log" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.546965 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="007f8331-533d-439d-beec-80291b7c4a0d" containerName="glance-httpd" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.549270 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.553752 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.553795 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.588240 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.616717 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dc6cd987c-j7ptj"] Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.624314 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7dc6cd987c-j7ptj"] Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.631534 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-4gkdr"] Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.636374 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-4gkdr"] Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.682809 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " 
pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.682870 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.682899 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cmr9\" (UniqueName: \"kubernetes.io/projected/6f597643-851f-448e-996d-5a30b83c535f-kube-api-access-5cmr9\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.682960 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.683187 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.683325 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-logs\") pod \"glance-default-external-api-0\" (UID: 
\"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.683374 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.683526 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.785942 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.785998 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.786017 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cmr9\" (UniqueName: \"kubernetes.io/projected/6f597643-851f-448e-996d-5a30b83c535f-kube-api-access-5cmr9\") pod \"glance-default-external-api-0\" (UID: 
\"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.786061 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.786106 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.786130 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-logs\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.786151 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.786179 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc 
kubenswrapper[4922]: I0929 10:01:35.787341 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.787737 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.788687 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-logs\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.791484 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.792847 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.793219 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.793460 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.806752 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cmr9\" (UniqueName: \"kubernetes.io/projected/6f597643-851f-448e-996d-5a30b83c535f-kube-api-access-5cmr9\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.829572 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " pod="openstack/glance-default-external-api-0" Sep 29 10:01:35 crc kubenswrapper[4922]: I0929 10:01:35.874935 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.261252 4922 scope.go:117] "RemoveContainer" containerID="530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe" Sep 29 10:01:36 crc kubenswrapper[4922]: E0929 10:01:36.284865 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Sep 29 10:01:36 crc kubenswrapper[4922]: E0929 10:01:36.285705 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config
.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trx5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-7lxww_openstack(0c2d9bba-864b-468d-923e-23cf0544daf9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:01:36 crc kubenswrapper[4922]: E0929 10:01:36.289288 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-7lxww" podUID="0c2d9bba-864b-468d-923e-23cf0544daf9" Sep 29 10:01:36 crc kubenswrapper[4922]: E0929 10:01:36.548658 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-7lxww" podUID="0c2d9bba-864b-468d-923e-23cf0544daf9" Sep 29 10:01:36 crc kubenswrapper[4922]: 
I0929 10:01:36.875604 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58b957f588-sp2bt"] Sep 29 10:01:36 crc kubenswrapper[4922]: W0929 10:01:36.910047 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f21d67_d595_4458_871c_e4bbc362b134.slice/crio-9c3a1f9417cc3e4aea2ff75f84e49bf2237f38e18526eeef993a9e1d13159916 WatchSource:0}: Error finding container 9c3a1f9417cc3e4aea2ff75f84e49bf2237f38e18526eeef993a9e1d13159916: Status 404 returned error can't find the container with id 9c3a1f9417cc3e4aea2ff75f84e49bf2237f38e18526eeef993a9e1d13159916 Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.921259 4922 scope.go:117] "RemoveContainer" containerID="00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca" Sep 29 10:01:36 crc kubenswrapper[4922]: E0929 10:01:36.923059 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca\": container with ID starting with 00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca not found: ID does not exist" containerID="00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.923115 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca"} err="failed to get container status \"00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca\": rpc error: code = NotFound desc = could not find container \"00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca\": container with ID starting with 00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca not found: ID does not exist" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.923149 4922 scope.go:117] "RemoveContainer" 
containerID="530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe" Sep 29 10:01:36 crc kubenswrapper[4922]: E0929 10:01:36.924055 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe\": container with ID starting with 530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe not found: ID does not exist" containerID="530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.924936 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe"} err="failed to get container status \"530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe\": rpc error: code = NotFound desc = could not find container \"530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe\": container with ID starting with 530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe not found: ID does not exist" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.924990 4922 scope.go:117] "RemoveContainer" containerID="00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.925780 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca"} err="failed to get container status \"00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca\": rpc error: code = NotFound desc = could not find container \"00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca\": container with ID starting with 00bd6ed2075a117192aad07fb96c61068e770fe8ba8284be072d76b2706fd0ca not found: ID does not exist" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.925807 4922 scope.go:117] 
"RemoveContainer" containerID="530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.926384 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe"} err="failed to get container status \"530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe\": rpc error: code = NotFound desc = could not find container \"530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe\": container with ID starting with 530033f557e079b400384a693dde8337a913283124e83237728454f335f40ebe not found: ID does not exist" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.926427 4922 scope.go:117] "RemoveContainer" containerID="aed9e66955a463e33799c3dd3159a6a6070116f3fa4f04104362bac086b01ca3" Sep 29 10:01:36 crc kubenswrapper[4922]: I0929 10:01:36.983113 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fc765957b-xd4sr"] Sep 29 10:01:37 crc kubenswrapper[4922]: W0929 10:01:36.999911 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc63e97c2_45d9_4b32_9b0e_1449fad249e6.slice/crio-de7a34dec5357e85afd4f33949f846472ed92f73a1af820e606903b1a8d0f394 WatchSource:0}: Error finding container de7a34dec5357e85afd4f33949f846472ed92f73a1af820e606903b1a8d0f394: Status 404 returned error can't find the container with id de7a34dec5357e85afd4f33949f846472ed92f73a1af820e606903b1a8d0f394 Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.080458 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:37 crc kubenswrapper[4922]: W0929 10:01:37.115405 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f703db3_c82c_47f4_9d61_ca23afacae5a.slice/crio-32cb5a608c5ddd5b17a58e22f57db689cbc371821987f6e9462e4ced1645191b WatchSource:0}: Error finding container 32cb5a608c5ddd5b17a58e22f57db689cbc371821987f6e9462e4ced1645191b: Status 404 returned error can't find the container with id 32cb5a608c5ddd5b17a58e22f57db689cbc371821987f6e9462e4ced1645191b Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.158074 4922 scope.go:117] "RemoveContainer" containerID="8ce56584818ded922b8133ee5269042537847d075733a910e2ede7848a724a91" Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.228669 4922 scope.go:117] "RemoveContainer" containerID="f0a3a20c7d636fab40180967310e05f56eb26bfce339fdf3a99619e6715b2558" Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.258530 4922 scope.go:117] "RemoveContainer" containerID="dcaad16f64c31493de3f7cf2e3fd7264a73357388e3cdb51c94e13195e6483d7" Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.415483 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x7rtg"] Sep 29 10:01:37 crc kubenswrapper[4922]: W0929 10:01:37.418822 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b8fc131_e5f1_45f2_9c6c_1050ca368d5a.slice/crio-2ec416bb1d5ae24322bd5e6877e898efc304b50b21fe373e2d80328c9f0328df WatchSource:0}: Error finding container 2ec416bb1d5ae24322bd5e6877e898efc304b50b21fe373e2d80328c9f0328df: Status 404 returned error can't find the container with id 2ec416bb1d5ae24322bd5e6877e898efc304b50b21fe373e2d80328c9f0328df Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.464319 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007f8331-533d-439d-beec-80291b7c4a0d" path="/var/lib/kubelet/pods/007f8331-533d-439d-beec-80291b7c4a0d/volumes" Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.465167 4922 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" path="/var/lib/kubelet/pods/90d0dcdf-6643-403b-875f-6d1f3fd797c9/volumes" Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.466601 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa91436-e778-43bb-b052-6e5e9928a705" path="/var/lib/kubelet/pods/eaa91436-e778-43bb-b052-6e5e9928a705/volumes" Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.565165 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1089c4d-63a4-4d54-892c-d4c08291d4ec","Type":"ContainerStarted","Data":"2161c77ca8997d8ca98f140447a3ddb270a71d4c1bbab0e07342af8a572264ad"} Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.566646 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58b957f588-sp2bt" event={"ID":"84f21d67-d595-4458-871c-e4bbc362b134","Type":"ContainerStarted","Data":"9c3a1f9417cc3e4aea2ff75f84e49bf2237f38e18526eeef993a9e1d13159916"} Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.580069 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7rtg" event={"ID":"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a","Type":"ContainerStarted","Data":"2ec416bb1d5ae24322bd5e6877e898efc304b50b21fe373e2d80328c9f0328df"} Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.582116 4922 generic.go:334] "Generic (PLEG): container finished" podID="27ae426f-29ac-46dd-a865-41b4c4a0e722" containerID="0bea57f17394c642e23109b448bdbb1ffe261d1c274d5716f320f14a775b8168" exitCode=0 Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.582182 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nxhsm" event={"ID":"27ae426f-29ac-46dd-a865-41b4c4a0e722","Type":"ContainerDied","Data":"0bea57f17394c642e23109b448bdbb1ffe261d1c274d5716f320f14a775b8168"} Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.584362 4922 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f703db3-c82c-47f4-9d61-ca23afacae5a","Type":"ContainerStarted","Data":"32cb5a608c5ddd5b17a58e22f57db689cbc371821987f6e9462e4ced1645191b"} Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.591641 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc765957b-xd4sr" event={"ID":"c63e97c2-45d9-4b32-9b0e-1449fad249e6","Type":"ContainerStarted","Data":"de7a34dec5357e85afd4f33949f846472ed92f73a1af820e606903b1a8d0f394"} Sep 29 10:01:37 crc kubenswrapper[4922]: I0929 10:01:37.609113 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:01:37 crc kubenswrapper[4922]: W0929 10:01:37.614165 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f597643_851f_448e_996d_5a30b83c535f.slice/crio-3940b92a157226472b741e8629dcd006c90e31c6b7ae179fb0165e6bb3ae381e WatchSource:0}: Error finding container 3940b92a157226472b741e8629dcd006c90e31c6b7ae179fb0165e6bb3ae381e: Status 404 returned error can't find the container with id 3940b92a157226472b741e8629dcd006c90e31c6b7ae179fb0165e6bb3ae381e Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.624202 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58b957f588-sp2bt" event={"ID":"84f21d67-d595-4458-871c-e4bbc362b134","Type":"ContainerStarted","Data":"669d351a7b45130a01ea2754a937e54c4f07d56668bfddfb536890c96e656370"} Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.625019 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58b957f588-sp2bt" event={"ID":"84f21d67-d595-4458-871c-e4bbc362b134","Type":"ContainerStarted","Data":"797bebf5656c38b61601657174608e0ec6b3bcceae6ac81d21558ac50f9e3f00"} Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.632744 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"9f703db3-c82c-47f4-9d61-ca23afacae5a","Type":"ContainerStarted","Data":"cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d"} Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.632792 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f703db3-c82c-47f4-9d61-ca23afacae5a","Type":"ContainerStarted","Data":"c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e"} Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.632909 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerName="glance-httpd" containerID="cri-o://cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d" gracePeriod=30 Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.633020 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerName="glance-log" containerID="cri-o://c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e" gracePeriod=30 Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.639742 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc765957b-xd4sr" event={"ID":"c63e97c2-45d9-4b32-9b0e-1449fad249e6","Type":"ContainerStarted","Data":"279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66"} Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.639797 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc765957b-xd4sr" event={"ID":"c63e97c2-45d9-4b32-9b0e-1449fad249e6","Type":"ContainerStarted","Data":"0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31"} Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.643230 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-x7rtg" event={"ID":"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a","Type":"ContainerStarted","Data":"6e9d93b9fd6a0f4de8cd71ea385314346b5733cb664ab08be96cfdeb215db6a7"} Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.656023 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58b957f588-sp2bt" podStartSLOduration=26.627555481 podStartE2EDuration="27.656000811s" podCreationTimestamp="2025-09-29 10:01:11 +0000 UTC" firstStartedPulling="2025-09-29 10:01:36.948800435 +0000 UTC m=+1022.315030699" lastFinishedPulling="2025-09-29 10:01:37.977245765 +0000 UTC m=+1023.343476029" observedRunningTime="2025-09-29 10:01:38.650218564 +0000 UTC m=+1024.016448828" watchObservedRunningTime="2025-09-29 10:01:38.656000811 +0000 UTC m=+1024.022231075" Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.656097 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f597643-851f-448e-996d-5a30b83c535f","Type":"ContainerStarted","Data":"0acd7c1aa28e817ab6423aae8c924696b3f2cc5b923255246f1c10e2f20ab09b"} Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.656205 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f597643-851f-448e-996d-5a30b83c535f","Type":"ContainerStarted","Data":"3940b92a157226472b741e8629dcd006c90e31c6b7ae179fb0165e6bb3ae381e"} Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.719395 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=30.719369329 podStartE2EDuration="30.719369329s" podCreationTimestamp="2025-09-29 10:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:38.676406204 +0000 UTC m=+1024.042636488" watchObservedRunningTime="2025-09-29 10:01:38.719369329 +0000 
UTC m=+1024.085599593" Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.719907 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x7rtg" podStartSLOduration=10.719902853 podStartE2EDuration="10.719902853s" podCreationTimestamp="2025-09-29 10:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:38.695929943 +0000 UTC m=+1024.062160207" watchObservedRunningTime="2025-09-29 10:01:38.719902853 +0000 UTC m=+1024.086133107" Sep 29 10:01:38 crc kubenswrapper[4922]: I0929 10:01:38.733612 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-fc765957b-xd4sr" podStartSLOduration=27.122719169 podStartE2EDuration="27.733588514s" podCreationTimestamp="2025-09-29 10:01:11 +0000 UTC" firstStartedPulling="2025-09-29 10:01:37.003751146 +0000 UTC m=+1022.369981410" lastFinishedPulling="2025-09-29 10:01:37.614620491 +0000 UTC m=+1022.980850755" observedRunningTime="2025-09-29 10:01:38.717283993 +0000 UTC m=+1024.083514247" watchObservedRunningTime="2025-09-29 10:01:38.733588514 +0000 UTC m=+1024.099818778" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.218784 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.249880 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.378637 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-scripts\") pod \"9f703db3-c82c-47f4-9d61-ca23afacae5a\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.378894 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-httpd-run\") pod \"9f703db3-c82c-47f4-9d61-ca23afacae5a\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.380339 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f703db3-c82c-47f4-9d61-ca23afacae5a" (UID: "9f703db3-c82c-47f4-9d61-ca23afacae5a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.380931 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsrmw\" (UniqueName: \"kubernetes.io/projected/27ae426f-29ac-46dd-a865-41b4c4a0e722-kube-api-access-dsrmw\") pod \"27ae426f-29ac-46dd-a865-41b4c4a0e722\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.381533 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj8pc\" (UniqueName: \"kubernetes.io/projected/9f703db3-c82c-47f4-9d61-ca23afacae5a-kube-api-access-zj8pc\") pod \"9f703db3-c82c-47f4-9d61-ca23afacae5a\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.382011 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-config-data\") pod \"9f703db3-c82c-47f4-9d61-ca23afacae5a\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.382045 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-combined-ca-bundle\") pod \"9f703db3-c82c-47f4-9d61-ca23afacae5a\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.382068 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-internal-tls-certs\") pod \"9f703db3-c82c-47f4-9d61-ca23afacae5a\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.382142 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-config\") pod \"27ae426f-29ac-46dd-a865-41b4c4a0e722\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.382225 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-logs\") pod \"9f703db3-c82c-47f4-9d61-ca23afacae5a\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.382253 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"9f703db3-c82c-47f4-9d61-ca23afacae5a\" (UID: \"9f703db3-c82c-47f4-9d61-ca23afacae5a\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.382330 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-combined-ca-bundle\") pod \"27ae426f-29ac-46dd-a865-41b4c4a0e722\" (UID: \"27ae426f-29ac-46dd-a865-41b4c4a0e722\") " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.383349 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.384351 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-logs" (OuterVolumeSpecName: "logs") pod "9f703db3-c82c-47f4-9d61-ca23afacae5a" (UID: "9f703db3-c82c-47f4-9d61-ca23afacae5a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.388025 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-scripts" (OuterVolumeSpecName: "scripts") pod "9f703db3-c82c-47f4-9d61-ca23afacae5a" (UID: "9f703db3-c82c-47f4-9d61-ca23afacae5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.389036 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ae426f-29ac-46dd-a865-41b4c4a0e722-kube-api-access-dsrmw" (OuterVolumeSpecName: "kube-api-access-dsrmw") pod "27ae426f-29ac-46dd-a865-41b4c4a0e722" (UID: "27ae426f-29ac-46dd-a865-41b4c4a0e722"). InnerVolumeSpecName "kube-api-access-dsrmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.389258 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f703db3-c82c-47f4-9d61-ca23afacae5a-kube-api-access-zj8pc" (OuterVolumeSpecName: "kube-api-access-zj8pc") pod "9f703db3-c82c-47f4-9d61-ca23afacae5a" (UID: "9f703db3-c82c-47f4-9d61-ca23afacae5a"). InnerVolumeSpecName "kube-api-access-zj8pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.401240 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "9f703db3-c82c-47f4-9d61-ca23afacae5a" (UID: "9f703db3-c82c-47f4-9d61-ca23afacae5a"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.438724 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-config" (OuterVolumeSpecName: "config") pod "27ae426f-29ac-46dd-a865-41b4c4a0e722" (UID: "27ae426f-29ac-46dd-a865-41b4c4a0e722"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.446440 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f703db3-c82c-47f4-9d61-ca23afacae5a" (UID: "9f703db3-c82c-47f4-9d61-ca23afacae5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.449979 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27ae426f-29ac-46dd-a865-41b4c4a0e722" (UID: "27ae426f-29ac-46dd-a865-41b4c4a0e722"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.463369 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-config-data" (OuterVolumeSpecName: "config-data") pod "9f703db3-c82c-47f4-9d61-ca23afacae5a" (UID: "9f703db3-c82c-47f4-9d61-ca23afacae5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.473001 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9f703db3-c82c-47f4-9d61-ca23afacae5a" (UID: "9f703db3-c82c-47f4-9d61-ca23afacae5a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485700 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485732 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485743 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsrmw\" (UniqueName: \"kubernetes.io/projected/27ae426f-29ac-46dd-a865-41b4c4a0e722-kube-api-access-dsrmw\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485756 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj8pc\" (UniqueName: \"kubernetes.io/projected/9f703db3-c82c-47f4-9d61-ca23afacae5a-kube-api-access-zj8pc\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485765 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485775 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485784 4922 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f703db3-c82c-47f4-9d61-ca23afacae5a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485795 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/27ae426f-29ac-46dd-a865-41b4c4a0e722-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485805 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f703db3-c82c-47f4-9d61-ca23afacae5a-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.485842 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.507357 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.588534 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.684082 4922 generic.go:334] "Generic (PLEG): container finished" podID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerID="cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d" exitCode=143 Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.684130 4922 generic.go:334] "Generic (PLEG): container 
finished" podID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerID="c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e" exitCode=143 Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.684175 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.684205 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f703db3-c82c-47f4-9d61-ca23afacae5a","Type":"ContainerDied","Data":"cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d"} Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.684245 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f703db3-c82c-47f4-9d61-ca23afacae5a","Type":"ContainerDied","Data":"c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e"} Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.684256 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f703db3-c82c-47f4-9d61-ca23afacae5a","Type":"ContainerDied","Data":"32cb5a608c5ddd5b17a58e22f57db689cbc371821987f6e9462e4ced1645191b"} Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.684285 4922 scope.go:117] "RemoveContainer" containerID="cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.696619 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jv255" event={"ID":"e4b100f6-2a77-43b8-8942-0e50151142d0","Type":"ContainerStarted","Data":"387201c090373052742813abe5c2c5cff6e3729873fba0d13266f1c498a49319"} Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.717959 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nxhsm" 
event={"ID":"27ae426f-29ac-46dd-a865-41b4c4a0e722","Type":"ContainerDied","Data":"728e1b5acbd3dc9b2cdcbaf1481fba6c4bd64ec1e08222b9957492bd28c0ef8e"} Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.718410 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="728e1b5acbd3dc9b2cdcbaf1481fba6c4bd64ec1e08222b9957492bd28c0ef8e" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.718626 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nxhsm" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.733267 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-4gkdr" podUID="90d0dcdf-6643-403b-875f-6d1f3fd797c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.744262 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jv255" podStartSLOduration=3.308632837 podStartE2EDuration="37.744241721s" podCreationTimestamp="2025-09-29 10:01:02 +0000 UTC" firstStartedPulling="2025-09-29 10:01:04.440511716 +0000 UTC m=+989.806741980" lastFinishedPulling="2025-09-29 10:01:38.8761206 +0000 UTC m=+1024.242350864" observedRunningTime="2025-09-29 10:01:39.731374382 +0000 UTC m=+1025.097604646" watchObservedRunningTime="2025-09-29 10:01:39.744241721 +0000 UTC m=+1025.110471985" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.744959 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f597643-851f-448e-996d-5a30b83c535f","Type":"ContainerStarted","Data":"60256daf381d89ac8700d2ba4428ff35b3b18e65d35458e0a160ec52ccfed346"} Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.788917 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:39 crc 
kubenswrapper[4922]: I0929 10:01:39.802085 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.829969 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.829949035 podStartE2EDuration="4.829949035s" podCreationTimestamp="2025-09-29 10:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:39.816505821 +0000 UTC m=+1025.182736085" watchObservedRunningTime="2025-09-29 10:01:39.829949035 +0000 UTC m=+1025.196179299" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.853990 4922 scope.go:117] "RemoveContainer" containerID="c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.894999 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-8ntc5"] Sep 29 10:01:39 crc kubenswrapper[4922]: E0929 10:01:39.895422 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerName="glance-httpd" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.895434 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerName="glance-httpd" Sep 29 10:01:39 crc kubenswrapper[4922]: E0929 10:01:39.895464 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ae426f-29ac-46dd-a865-41b4c4a0e722" containerName="neutron-db-sync" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.895470 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ae426f-29ac-46dd-a865-41b4c4a0e722" containerName="neutron-db-sync" Sep 29 10:01:39 crc kubenswrapper[4922]: E0929 10:01:39.895481 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerName="glance-log" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.895487 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerName="glance-log" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.895657 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerName="glance-log" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.895674 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ae426f-29ac-46dd-a865-41b4c4a0e722" containerName="neutron-db-sync" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.895694 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f703db3-c82c-47f4-9d61-ca23afacae5a" containerName="glance-httpd" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.909096 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.909241 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.911965 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.914136 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.914447 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.925184 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.934465 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-8ntc5"] Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.981734 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56694df76d-5npfb"] Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.984078 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.988397 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.988696 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.989036 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 29 10:01:39 crc kubenswrapper[4922]: I0929 10:01:39.989312 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ldrm5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.001654 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.001746 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.001846 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc 
kubenswrapper[4922]: I0929 10:01:40.001891 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-logs\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.001918 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.001974 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.002002 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tjtq\" (UniqueName: \"kubernetes.io/projected/c23d3429-fb5e-4829-9c7b-65f6104fe30c-kube-api-access-6tjtq\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.002023 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-config\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 
10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.002062 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.002126 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.002230 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh4jq\" (UniqueName: \"kubernetes.io/projected/08defa73-766c-460e-97b9-ca7a1194230d-kube-api-access-hh4jq\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.002255 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.002324 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " 
pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.002351 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.004493 4922 scope.go:117] "RemoveContainer" containerID="cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d" Sep 29 10:01:40 crc kubenswrapper[4922]: E0929 10:01:40.017016 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d\": container with ID starting with cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d not found: ID does not exist" containerID="cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.017077 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d"} err="failed to get container status \"cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d\": rpc error: code = NotFound desc = could not find container \"cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d\": container with ID starting with cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d not found: ID does not exist" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.017114 4922 scope.go:117] "RemoveContainer" containerID="c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e" Sep 29 10:01:40 crc kubenswrapper[4922]: E0929 10:01:40.017730 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e\": container with ID starting with c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e not found: ID does not exist" containerID="c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.017758 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e"} err="failed to get container status \"c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e\": rpc error: code = NotFound desc = could not find container \"c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e\": container with ID starting with c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e not found: ID does not exist" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.017777 4922 scope.go:117] "RemoveContainer" containerID="cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.022947 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d"} err="failed to get container status \"cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d\": rpc error: code = NotFound desc = could not find container \"cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d\": container with ID starting with cbb795578d264b9ad6a5178cde22c265270e661e790c8dfb575b4e0a163c505d not found: ID does not exist" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.022986 4922 scope.go:117] "RemoveContainer" containerID="c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.023337 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e"} err="failed to get container status \"c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e\": rpc error: code = NotFound desc = could not find container \"c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e\": container with ID starting with c12c2eee1737dcf85a50d75f4f2f4516fd10a83dd466c75de0b7fc7102a1f17e not found: ID does not exist" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.037040 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56694df76d-5npfb"] Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.103863 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.103931 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.103966 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.103986 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-logs\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104019 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104048 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104088 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tjtq\" (UniqueName: \"kubernetes.io/projected/c23d3429-fb5e-4829-9c7b-65f6104fe30c-kube-api-access-6tjtq\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104106 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-config\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104131 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-httpd-config\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104150 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104197 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104261 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-ovndb-tls-certs\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104283 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh4jq\" (UniqueName: \"kubernetes.io/projected/08defa73-766c-460e-97b9-ca7a1194230d-kube-api-access-hh4jq\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104301 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-combined-ca-bundle\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104333 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104354 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqq8t\" (UniqueName: \"kubernetes.io/projected/9c70da77-2c81-490c-a16f-91680ebea9b5-kube-api-access-rqq8t\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104382 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-config\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104418 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.104438 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.105557 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.107327 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.108018 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.110952 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.112172 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.113653 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.114430 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-logs\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.115179 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.115881 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.117281 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-config\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " 
pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.117396 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.117780 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.131328 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tjtq\" (UniqueName: \"kubernetes.io/projected/c23d3429-fb5e-4829-9c7b-65f6104fe30c-kube-api-access-6tjtq\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.138657 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh4jq\" (UniqueName: \"kubernetes.io/projected/08defa73-766c-460e-97b9-ca7a1194230d-kube-api-access-hh4jq\") pod \"dnsmasq-dns-84b966f6c9-8ntc5\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.174295 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.208024 
4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-ovndb-tls-certs\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.208577 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-combined-ca-bundle\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.208631 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqq8t\" (UniqueName: \"kubernetes.io/projected/9c70da77-2c81-490c-a16f-91680ebea9b5-kube-api-access-rqq8t\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.208673 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-config\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.208949 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-httpd-config\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.217938 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-ovndb-tls-certs\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.221290 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-httpd-config\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.225792 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-combined-ca-bundle\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.237174 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-config\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.239768 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqq8t\" (UniqueName: \"kubernetes.io/projected/9c70da77-2c81-490c-a16f-91680ebea9b5-kube-api-access-rqq8t\") pod \"neutron-56694df76d-5npfb\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.353950 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.373514 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:40 crc kubenswrapper[4922]: I0929 10:01:40.382068 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.014952 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-8ntc5"] Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.099545 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56694df76d-5npfb"] Sep 29 10:01:41 crc kubenswrapper[4922]: W0929 10:01:41.162088 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c70da77_2c81_490c_a16f_91680ebea9b5.slice/crio-4a38f2e4fe6dd34a94007a39ebf2a5c70c2b317089403cb81feed80b5b0b107b WatchSource:0}: Error finding container 4a38f2e4fe6dd34a94007a39ebf2a5c70c2b317089403cb81feed80b5b0b107b: Status 404 returned error can't find the container with id 4a38f2e4fe6dd34a94007a39ebf2a5c70c2b317089403cb81feed80b5b0b107b Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.227024 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.478130 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f703db3-c82c-47f4-9d61-ca23afacae5a" path="/var/lib/kubelet/pods/9f703db3-c82c-47f4-9d61-ca23afacae5a/volumes" Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.760090 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.760645 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.802255 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"c23d3429-fb5e-4829-9c7b-65f6104fe30c","Type":"ContainerStarted","Data":"fba3fec3edcff73a297bbd9b9322e0cd187f6d7a1c7e19c2e62580935c4f9eaf"} Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.806039 4922 generic.go:334] "Generic (PLEG): container finished" podID="08defa73-766c-460e-97b9-ca7a1194230d" containerID="ec693b0837d4065fa01a22507007b0483a3de82c7b5d8b40e0121758ecf79e5c" exitCode=0 Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.806383 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" event={"ID":"08defa73-766c-460e-97b9-ca7a1194230d","Type":"ContainerDied","Data":"ec693b0837d4065fa01a22507007b0483a3de82c7b5d8b40e0121758ecf79e5c"} Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.806433 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" event={"ID":"08defa73-766c-460e-97b9-ca7a1194230d","Type":"ContainerStarted","Data":"e511a577f9169706cf94835042017b497d7f1d331618a985d2fbbc1fae27830e"} Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.808442 4922 generic.go:334] "Generic (PLEG): container finished" podID="1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" containerID="6e9d93b9fd6a0f4de8cd71ea385314346b5733cb664ab08be96cfdeb215db6a7" exitCode=0 Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.808522 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7rtg" event={"ID":"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a","Type":"ContainerDied","Data":"6e9d93b9fd6a0f4de8cd71ea385314346b5733cb664ab08be96cfdeb215db6a7"} Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.826348 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56694df76d-5npfb" event={"ID":"9c70da77-2c81-490c-a16f-91680ebea9b5","Type":"ContainerStarted","Data":"fe31755261ee819f5b1ed9789272133534028a97c2064da8e6ed0c950ba4e354"} Sep 29 10:01:41 crc kubenswrapper[4922]: 
I0929 10:01:41.826425 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56694df76d-5npfb" event={"ID":"9c70da77-2c81-490c-a16f-91680ebea9b5","Type":"ContainerStarted","Data":"4a38f2e4fe6dd34a94007a39ebf2a5c70c2b317089403cb81feed80b5b0b107b"} Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.875155 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:41 crc kubenswrapper[4922]: I0929 10:01:41.877126 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.713718 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8b5fcf5f9-p74mm"] Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.716636 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.719417 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.720395 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.740544 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8b5fcf5f9-p74mm"] Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.775278 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-ovndb-tls-certs\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.775621 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-combined-ca-bundle\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.775805 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-internal-tls-certs\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.775943 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-public-tls-certs\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.776017 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-httpd-config\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.776215 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv2l2\" (UniqueName: \"kubernetes.io/projected/59b8f377-8449-49f6-992b-6b76ef613283-kube-api-access-rv2l2\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.776316 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-config\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.860588 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23d3429-fb5e-4829-9c7b-65f6104fe30c","Type":"ContainerStarted","Data":"c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694"} Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.863328 4922 generic.go:334] "Generic (PLEG): container finished" podID="e4b100f6-2a77-43b8-8942-0e50151142d0" containerID="387201c090373052742813abe5c2c5cff6e3729873fba0d13266f1c498a49319" exitCode=0 Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.863455 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jv255" event={"ID":"e4b100f6-2a77-43b8-8942-0e50151142d0","Type":"ContainerDied","Data":"387201c090373052742813abe5c2c5cff6e3729873fba0d13266f1c498a49319"} Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.878433 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-combined-ca-bundle\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.878499 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-internal-tls-certs\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.878532 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-public-tls-certs\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.878557 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-httpd-config\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.878601 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv2l2\" (UniqueName: \"kubernetes.io/projected/59b8f377-8449-49f6-992b-6b76ef613283-kube-api-access-rv2l2\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.878634 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-config\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.878676 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-ovndb-tls-certs\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.890109 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-httpd-config\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.891677 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-combined-ca-bundle\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.893332 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-internal-tls-certs\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.893506 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-ovndb-tls-certs\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.893645 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-public-tls-certs\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.898748 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/59b8f377-8449-49f6-992b-6b76ef613283-config\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: 
\"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:42 crc kubenswrapper[4922]: I0929 10:01:42.902875 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv2l2\" (UniqueName: \"kubernetes.io/projected/59b8f377-8449-49f6-992b-6b76ef613283-kube-api-access-rv2l2\") pod \"neutron-8b5fcf5f9-p74mm\" (UID: \"59b8f377-8449-49f6-992b-6b76ef613283\") " pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:43 crc kubenswrapper[4922]: I0929 10:01:43.037193 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:45 crc kubenswrapper[4922]: I0929 10:01:45.875494 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 10:01:45 crc kubenswrapper[4922]: I0929 10:01:45.876339 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 10:01:45 crc kubenswrapper[4922]: I0929 10:01:45.914475 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 10:01:45 crc kubenswrapper[4922]: I0929 10:01:45.915947 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:01:45 crc kubenswrapper[4922]: I0929 10:01:45.953679 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 10:01:46 crc kubenswrapper[4922]: I0929 10:01:46.917267 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.439739 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jv255" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.442521 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.506665 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-config-data\") pod \"e4b100f6-2a77-43b8-8942-0e50151142d0\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507168 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-config-data\") pod \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507225 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-combined-ca-bundle\") pod \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507268 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-combined-ca-bundle\") pod \"e4b100f6-2a77-43b8-8942-0e50151142d0\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507315 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-credential-keys\") pod \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\" (UID: 
\"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507444 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b100f6-2a77-43b8-8942-0e50151142d0-logs\") pod \"e4b100f6-2a77-43b8-8942-0e50151142d0\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507530 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-scripts\") pod \"e4b100f6-2a77-43b8-8942-0e50151142d0\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507549 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-fernet-keys\") pod \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507586 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn4v8\" (UniqueName: \"kubernetes.io/projected/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-kube-api-access-wn4v8\") pod \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507620 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sfbw\" (UniqueName: \"kubernetes.io/projected/e4b100f6-2a77-43b8-8942-0e50151142d0-kube-api-access-5sfbw\") pod \"e4b100f6-2a77-43b8-8942-0e50151142d0\" (UID: \"e4b100f6-2a77-43b8-8942-0e50151142d0\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.507662 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-scripts\") pod \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\" (UID: \"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a\") " Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.518116 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-scripts" (OuterVolumeSpecName: "scripts") pod "e4b100f6-2a77-43b8-8942-0e50151142d0" (UID: "e4b100f6-2a77-43b8-8942-0e50151142d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.518539 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b100f6-2a77-43b8-8942-0e50151142d0-logs" (OuterVolumeSpecName: "logs") pod "e4b100f6-2a77-43b8-8942-0e50151142d0" (UID: "e4b100f6-2a77-43b8-8942-0e50151142d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.519415 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-kube-api-access-wn4v8" (OuterVolumeSpecName: "kube-api-access-wn4v8") pod "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" (UID: "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a"). InnerVolumeSpecName "kube-api-access-wn4v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.521721 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" (UID: "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.526239 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b100f6-2a77-43b8-8942-0e50151142d0-kube-api-access-5sfbw" (OuterVolumeSpecName: "kube-api-access-5sfbw") pod "e4b100f6-2a77-43b8-8942-0e50151142d0" (UID: "e4b100f6-2a77-43b8-8942-0e50151142d0"). InnerVolumeSpecName "kube-api-access-5sfbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.579575 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-scripts" (OuterVolumeSpecName: "scripts") pod "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" (UID: "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.579674 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" (UID: "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.584389 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-config-data" (OuterVolumeSpecName: "config-data") pod "e4b100f6-2a77-43b8-8942-0e50151142d0" (UID: "e4b100f6-2a77-43b8-8942-0e50151142d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.585268 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" (UID: "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.585999 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4b100f6-2a77-43b8-8942-0e50151142d0" (UID: "e4b100f6-2a77-43b8-8942-0e50151142d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.590968 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-config-data" (OuterVolumeSpecName: "config-data") pod "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" (UID: "1b8fc131-e5f1-45f2-9c6c-1050ca368d5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613731 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b100f6-2a77-43b8-8942-0e50151142d0-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613763 4922 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613776 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613790 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn4v8\" (UniqueName: \"kubernetes.io/projected/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-kube-api-access-wn4v8\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613804 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sfbw\" (UniqueName: \"kubernetes.io/projected/e4b100f6-2a77-43b8-8942-0e50151142d0-kube-api-access-5sfbw\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613813 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613822 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613844 4922 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613854 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613864 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b100f6-2a77-43b8-8942-0e50151142d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.613873 4922 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.955357 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" event={"ID":"08defa73-766c-460e-97b9-ca7a1194230d","Type":"ContainerStarted","Data":"6082321864e7c7057b85d91354c4b75d44321f80d7bcbd2172b58e7eb397997f"} Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.957235 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.971450 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7rtg" event={"ID":"1b8fc131-e5f1-45f2-9c6c-1050ca368d5a","Type":"ContainerDied","Data":"2ec416bb1d5ae24322bd5e6877e898efc304b50b21fe373e2d80328c9f0328df"} Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.971515 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ec416bb1d5ae24322bd5e6877e898efc304b50b21fe373e2d80328c9f0328df" Sep 29 10:01:47 crc 
kubenswrapper[4922]: I0929 10:01:47.971641 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7rtg" Sep 29 10:01:47 crc kubenswrapper[4922]: I0929 10:01:47.985038 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" podStartSLOduration=8.985018279 podStartE2EDuration="8.985018279s" podCreationTimestamp="2025-09-29 10:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:47.982970573 +0000 UTC m=+1033.349200837" watchObservedRunningTime="2025-09-29 10:01:47.985018279 +0000 UTC m=+1033.351248543" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.000507 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56694df76d-5npfb" event={"ID":"9c70da77-2c81-490c-a16f-91680ebea9b5","Type":"ContainerStarted","Data":"a1fbbafc63e8d3c19bafc6b82450832e0052d711c0d1d09ca40286e1866e0175"} Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.000812 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.007574 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.008096 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jv255" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.009235 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jv255" event={"ID":"e4b100f6-2a77-43b8-8942-0e50151142d0","Type":"ContainerDied","Data":"a857d363570297c12fe8b96f859391c1a004f62f463a63770da5251b95cfe82b"} Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.009281 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a857d363570297c12fe8b96f859391c1a004f62f463a63770da5251b95cfe82b" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.024945 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56694df76d-5npfb" podStartSLOduration=9.024917371 podStartE2EDuration="9.024917371s" podCreationTimestamp="2025-09-29 10:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:48.023597475 +0000 UTC m=+1033.389827739" watchObservedRunningTime="2025-09-29 10:01:48.024917371 +0000 UTC m=+1033.391147635" Sep 29 10:01:48 crc kubenswrapper[4922]: W0929 10:01:48.157075 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59b8f377_8449_49f6_992b_6b76ef613283.slice/crio-dcac413eca1f520861d1ea52ce8bfca739d3f10af7149d9ad66d33ddbac19f41 WatchSource:0}: Error finding container dcac413eca1f520861d1ea52ce8bfca739d3f10af7149d9ad66d33ddbac19f41: Status 404 returned error can't find the container with id dcac413eca1f520861d1ea52ce8bfca739d3f10af7149d9ad66d33ddbac19f41 Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.159534 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8b5fcf5f9-p74mm"] Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.423015 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.609756 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7fc9c79bdd-vqp6p"] Sep 29 10:01:48 crc kubenswrapper[4922]: E0929 10:01:48.610922 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" containerName="keystone-bootstrap" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.610954 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" containerName="keystone-bootstrap" Sep 29 10:01:48 crc kubenswrapper[4922]: E0929 10:01:48.610989 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b100f6-2a77-43b8-8942-0e50151142d0" containerName="placement-db-sync" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.611000 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b100f6-2a77-43b8-8942-0e50151142d0" containerName="placement-db-sync" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.611243 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" containerName="keystone-bootstrap" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.611280 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b100f6-2a77-43b8-8942-0e50151142d0" containerName="placement-db-sync" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.612691 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.616797 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z57ns" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.623117 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.623261 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.623284 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.623311 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.625701 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7fc9c79bdd-vqp6p"] Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.710909 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-79f578d789-bbw9r"] Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.712917 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.717653 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.717917 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-94nrr" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.718031 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.718174 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.718278 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.721224 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742510 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-scripts\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742576 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-credential-keys\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-fernet-keys\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742649 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-config-data\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742682 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6gr\" (UniqueName: \"kubernetes.io/projected/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-kube-api-access-tl6gr\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742719 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-logs\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742742 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-internal-tls-certs\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742779 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-config-data\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742806 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwzzq\" (UniqueName: \"kubernetes.io/projected/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-kube-api-access-qwzzq\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742873 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-internal-tls-certs\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742903 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-combined-ca-bundle\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742919 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-public-tls-certs\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742941 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-scripts\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742966 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-combined-ca-bundle\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.742987 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-public-tls-certs\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.743891 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79f578d789-bbw9r"] Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845124 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzzq\" (UniqueName: \"kubernetes.io/projected/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-kube-api-access-qwzzq\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845232 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-internal-tls-certs\") pod \"keystone-79f578d789-bbw9r\" (UID: 
\"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845295 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-combined-ca-bundle\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845332 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-public-tls-certs\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845365 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-scripts\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845451 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-combined-ca-bundle\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845493 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-public-tls-certs\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " 
pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845549 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-scripts\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845611 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-credential-keys\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-fernet-keys\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845706 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-config-data\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845757 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6gr\" (UniqueName: \"kubernetes.io/projected/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-kube-api-access-tl6gr\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845819 
4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-logs\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845869 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-internal-tls-certs\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.845909 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-config-data\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.864930 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-combined-ca-bundle\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.865530 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-logs\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.866950 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-config-data\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.867447 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-scripts\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.869200 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-public-tls-certs\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.869591 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-internal-tls-certs\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.869741 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-combined-ca-bundle\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.877584 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-internal-tls-certs\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: 
\"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.877638 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-credential-keys\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.878114 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-config-data\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.878227 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-fernet-keys\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.878447 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-public-tls-certs\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.883370 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwzzq\" (UniqueName: \"kubernetes.io/projected/aceaf0a2-2b2b-4ef9-99d1-8bd21f553634-kube-api-access-qwzzq\") pod \"placement-7fc9c79bdd-vqp6p\" (UID: \"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634\") " pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:48 crc 
kubenswrapper[4922]: I0929 10:01:48.900353 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-scripts\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.902514 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6gr\" (UniqueName: \"kubernetes.io/projected/858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99-kube-api-access-tl6gr\") pod \"keystone-79f578d789-bbw9r\" (UID: \"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99\") " pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:48 crc kubenswrapper[4922]: I0929 10:01:48.935736 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.038587 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1089c4d-63a4-4d54-892c-d4c08291d4ec","Type":"ContainerStarted","Data":"0fdf40ecda14fd6173dc71928769af8499e874d3aae60187838d27020b1575b4"} Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.065810 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.082701 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtwl5" event={"ID":"ed54c52a-229b-45f0-8526-19d6ca42237c","Type":"ContainerStarted","Data":"3fd6d839f8fafd5749397a685dc26ea36c366c517f58c90964d888ca0f8de4ca"} Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.102401 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23d3429-fb5e-4829-9c7b-65f6104fe30c","Type":"ContainerStarted","Data":"92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0"} Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.121111 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8b5fcf5f9-p74mm" event={"ID":"59b8f377-8449-49f6-992b-6b76ef613283","Type":"ContainerStarted","Data":"ad4556ba89b73616e51deb4598b165870f4ed3a12d15fa7c0b6a53712f46413d"} Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.121182 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8b5fcf5f9-p74mm" event={"ID":"59b8f377-8449-49f6-992b-6b76ef613283","Type":"ContainerStarted","Data":"e4096763f6080a340f7091cf965a915dec4c99b399517a08b8d126b52990add9"} Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.121193 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8b5fcf5f9-p74mm" event={"ID":"59b8f377-8449-49f6-992b-6b76ef613283","Type":"ContainerStarted","Data":"dcac413eca1f520861d1ea52ce8bfca739d3f10af7149d9ad66d33ddbac19f41"} Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.124503 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.125457 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.131809 
4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-dtwl5" podStartSLOduration=3.743697344 podStartE2EDuration="47.131783567s" podCreationTimestamp="2025-09-29 10:01:02 +0000 UTC" firstStartedPulling="2025-09-29 10:01:04.208884734 +0000 UTC m=+989.575114998" lastFinishedPulling="2025-09-29 10:01:47.596970957 +0000 UTC m=+1032.963201221" observedRunningTime="2025-09-29 10:01:49.129800692 +0000 UTC m=+1034.496030956" watchObservedRunningTime="2025-09-29 10:01:49.131783567 +0000 UTC m=+1034.498013821" Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.212085 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.212049013 podStartE2EDuration="10.212049013s" podCreationTimestamp="2025-09-29 10:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:49.183621322 +0000 UTC m=+1034.549851596" watchObservedRunningTime="2025-09-29 10:01:49.212049013 +0000 UTC m=+1034.578279277" Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.281698 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8b5fcf5f9-p74mm" podStartSLOduration=7.281669841 podStartE2EDuration="7.281669841s" podCreationTimestamp="2025-09-29 10:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:49.219201587 +0000 UTC m=+1034.585431851" watchObservedRunningTime="2025-09-29 10:01:49.281669841 +0000 UTC m=+1034.647900095" Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.615461 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.709784 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-7fc9c79bdd-vqp6p"] Sep 29 10:01:49 crc kubenswrapper[4922]: I0929 10:01:49.897201 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79f578d789-bbw9r"] Sep 29 10:01:50 crc kubenswrapper[4922]: I0929 10:01:50.184136 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fc9c79bdd-vqp6p" event={"ID":"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634","Type":"ContainerStarted","Data":"5de617490046aba9a50b1641be53e6e0518cda23c41abe26540f038b03af9480"} Sep 29 10:01:50 crc kubenswrapper[4922]: I0929 10:01:50.195275 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79f578d789-bbw9r" event={"ID":"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99","Type":"ContainerStarted","Data":"92164f33c04f8a996576bedce2132fb2b1ebb672179547db23cdbcd23df001f6"} Sep 29 10:01:50 crc kubenswrapper[4922]: I0929 10:01:50.375120 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:50 crc kubenswrapper[4922]: I0929 10:01:50.375813 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:50 crc kubenswrapper[4922]: I0929 10:01:50.426944 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:50 crc kubenswrapper[4922]: I0929 10:01:50.439817 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.218607 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7lxww" event={"ID":"0c2d9bba-864b-468d-923e-23cf0544daf9","Type":"ContainerStarted","Data":"eef47573cc31b3b5a4cc7b1598a121fdb0fc95cdc2c8737b7e2da4edbb3fd519"} Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.227207 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-79f578d789-bbw9r" event={"ID":"858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99","Type":"ContainerStarted","Data":"43b3d17bad9243e36309be79a0816f7bb7527c512c6b00d73f5561e202535629"} Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.227951 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.232918 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fc9c79bdd-vqp6p" event={"ID":"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634","Type":"ContainerStarted","Data":"7a1307774c4f1c3e8699caff8612aebbf06ad1340178ceeb9e57aca2312c0479"} Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.232983 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fc9c79bdd-vqp6p" event={"ID":"aceaf0a2-2b2b-4ef9-99d1-8bd21f553634","Type":"ContainerStarted","Data":"1c65928cd12744926f900a9d900effc9c05426af0a31103c1e993600b6626ed1"} Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.233176 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.233205 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.256701 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-7lxww" podStartSLOduration=4.864450196 podStartE2EDuration="49.256679558s" podCreationTimestamp="2025-09-29 10:01:02 +0000 UTC" firstStartedPulling="2025-09-29 10:01:04.602398165 +0000 UTC m=+989.968628429" lastFinishedPulling="2025-09-29 10:01:48.994627527 +0000 UTC m=+1034.360857791" observedRunningTime="2025-09-29 10:01:51.246399979 +0000 UTC m=+1036.612630243" watchObservedRunningTime="2025-09-29 10:01:51.256679558 +0000 UTC m=+1036.622909822" Sep 29 10:01:51 crc 
kubenswrapper[4922]: I0929 10:01:51.271220 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-79f578d789-bbw9r" podStartSLOduration=3.271191242 podStartE2EDuration="3.271191242s" podCreationTimestamp="2025-09-29 10:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:51.263758759 +0000 UTC m=+1036.629989023" watchObservedRunningTime="2025-09-29 10:01:51.271191242 +0000 UTC m=+1036.637421506" Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.302943 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7fc9c79bdd-vqp6p" podStartSLOduration=3.302923561 podStartE2EDuration="3.302923561s" podCreationTimestamp="2025-09-29 10:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:01:51.297793583 +0000 UTC m=+1036.664023867" watchObservedRunningTime="2025-09-29 10:01:51.302923561 +0000 UTC m=+1036.669153825" Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.765538 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-fc765957b-xd4sr" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Sep 29 10:01:51 crc kubenswrapper[4922]: I0929 10:01:51.877885 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-58b957f588-sp2bt" podUID="84f21d67-d595-4458-871c-e4bbc362b134" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Sep 29 10:01:52 crc kubenswrapper[4922]: I0929 10:01:52.251285 4922 generic.go:334] "Generic (PLEG): 
container finished" podID="ed54c52a-229b-45f0-8526-19d6ca42237c" containerID="3fd6d839f8fafd5749397a685dc26ea36c366c517f58c90964d888ca0f8de4ca" exitCode=0 Sep 29 10:01:52 crc kubenswrapper[4922]: I0929 10:01:52.251437 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtwl5" event={"ID":"ed54c52a-229b-45f0-8526-19d6ca42237c","Type":"ContainerDied","Data":"3fd6d839f8fafd5749397a685dc26ea36c366c517f58c90964d888ca0f8de4ca"} Sep 29 10:01:52 crc kubenswrapper[4922]: I0929 10:01:52.253341 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:52 crc kubenswrapper[4922]: I0929 10:01:52.253386 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:01:53 crc kubenswrapper[4922]: I0929 10:01:53.683383 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:55 crc kubenswrapper[4922]: I0929 10:01:55.357984 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:01:55 crc kubenswrapper[4922]: I0929 10:01:55.442347 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-lrmb8"] Sep 29 10:01:55 crc kubenswrapper[4922]: I0929 10:01:55.442692 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" podUID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" containerName="dnsmasq-dns" containerID="cri-o://d0a71cd1c28edd8cecf5b041c4bac087b227f47dfd5c2566f32df38691862579" gracePeriod=10 Sep 29 10:01:56 crc kubenswrapper[4922]: I0929 10:01:56.099909 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 10:01:56 crc kubenswrapper[4922]: I0929 10:01:56.311562 4922 generic.go:334] "Generic (PLEG): container 
finished" podID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" containerID="d0a71cd1c28edd8cecf5b041c4bac087b227f47dfd5c2566f32df38691862579" exitCode=0 Sep 29 10:01:56 crc kubenswrapper[4922]: I0929 10:01:56.312062 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" event={"ID":"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63","Type":"ContainerDied","Data":"d0a71cd1c28edd8cecf5b041c4bac087b227f47dfd5c2566f32df38691862579"} Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.339157 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtwl5" event={"ID":"ed54c52a-229b-45f0-8526-19d6ca42237c","Type":"ContainerDied","Data":"b2ab479284f23949e46567e4d9c2165fd21b2f696bc3f835a785432210be4fcf"} Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.340949 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2ab479284f23949e46567e4d9c2165fd21b2f696bc3f835a785432210be4fcf" Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.349820 4922 generic.go:334] "Generic (PLEG): container finished" podID="0c2d9bba-864b-468d-923e-23cf0544daf9" containerID="eef47573cc31b3b5a4cc7b1598a121fdb0fc95cdc2c8737b7e2da4edbb3fd519" exitCode=0 Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.350712 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7lxww" event={"ID":"0c2d9bba-864b-468d-923e-23cf0544daf9","Type":"ContainerDied","Data":"eef47573cc31b3b5a4cc7b1598a121fdb0fc95cdc2c8737b7e2da4edbb3fd519"} Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.409260 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.508083 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfb9s\" (UniqueName: \"kubernetes.io/projected/ed54c52a-229b-45f0-8526-19d6ca42237c-kube-api-access-hfb9s\") pod \"ed54c52a-229b-45f0-8526-19d6ca42237c\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.508146 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-db-sync-config-data\") pod \"ed54c52a-229b-45f0-8526-19d6ca42237c\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.508179 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-combined-ca-bundle\") pod \"ed54c52a-229b-45f0-8526-19d6ca42237c\" (UID: \"ed54c52a-229b-45f0-8526-19d6ca42237c\") " Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.517951 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed54c52a-229b-45f0-8526-19d6ca42237c-kube-api-access-hfb9s" (OuterVolumeSpecName: "kube-api-access-hfb9s") pod "ed54c52a-229b-45f0-8526-19d6ca42237c" (UID: "ed54c52a-229b-45f0-8526-19d6ca42237c"). InnerVolumeSpecName "kube-api-access-hfb9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.531597 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ed54c52a-229b-45f0-8526-19d6ca42237c" (UID: "ed54c52a-229b-45f0-8526-19d6ca42237c"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.552622 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed54c52a-229b-45f0-8526-19d6ca42237c" (UID: "ed54c52a-229b-45f0-8526-19d6ca42237c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.610960 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfb9s\" (UniqueName: \"kubernetes.io/projected/ed54c52a-229b-45f0-8526-19d6ca42237c-kube-api-access-hfb9s\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.611012 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:57 crc kubenswrapper[4922]: I0929 10:01:57.611025 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed54c52a-229b-45f0-8526-19d6ca42237c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.361708 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dtwl5" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.741847 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-59bb77879f-bdcc9"] Sep 29 10:01:58 crc kubenswrapper[4922]: E0929 10:01:58.742318 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed54c52a-229b-45f0-8526-19d6ca42237c" containerName="barbican-db-sync" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.742330 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed54c52a-229b-45f0-8526-19d6ca42237c" containerName="barbican-db-sync" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.742558 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed54c52a-229b-45f0-8526-19d6ca42237c" containerName="barbican-db-sync" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.743597 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.746062 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.746079 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gzpqd" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.746488 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.764979 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-59bb77879f-bdcc9"] Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.792849 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7cdd7877d-rvfhb"] Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.797789 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.800149 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.837934 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-combined-ca-bundle\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.838050 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-config-data-custom\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.838148 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfkz6\" (UniqueName: \"kubernetes.io/projected/541f048f-4db6-45d6-aaa2-659dc9ff0b86-kube-api-access-bfkz6\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.838200 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541f048f-4db6-45d6-aaa2-659dc9ff0b86-logs\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc 
kubenswrapper[4922]: I0929 10:01:58.838237 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-logs\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.838257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541f048f-4db6-45d6-aaa2-659dc9ff0b86-config-data\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.838283 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/541f048f-4db6-45d6-aaa2-659dc9ff0b86-config-data-custom\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.838321 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wccfb\" (UniqueName: \"kubernetes.io/projected/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-kube-api-access-wccfb\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.838343 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541f048f-4db6-45d6-aaa2-659dc9ff0b86-combined-ca-bundle\") pod 
\"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.838375 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-config-data\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.856352 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cdd7877d-rvfhb"] Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.940529 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541f048f-4db6-45d6-aaa2-659dc9ff0b86-logs\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.940598 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-logs\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.940629 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541f048f-4db6-45d6-aaa2-659dc9ff0b86-config-data\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.941106 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/541f048f-4db6-45d6-aaa2-659dc9ff0b86-config-data-custom\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.941183 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wccfb\" (UniqueName: \"kubernetes.io/projected/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-kube-api-access-wccfb\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.941210 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541f048f-4db6-45d6-aaa2-659dc9ff0b86-combined-ca-bundle\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.941455 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-config-data\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.941508 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-combined-ca-bundle\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc 
kubenswrapper[4922]: I0929 10:01:58.941607 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-config-data-custom\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.942444 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfkz6\" (UniqueName: \"kubernetes.io/projected/541f048f-4db6-45d6-aaa2-659dc9ff0b86-kube-api-access-bfkz6\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.942649 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-logs\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.942993 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541f048f-4db6-45d6-aaa2-659dc9ff0b86-logs\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.945473 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xqhhx"] Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.947617 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.955957 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/541f048f-4db6-45d6-aaa2-659dc9ff0b86-config-data-custom\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.956634 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541f048f-4db6-45d6-aaa2-659dc9ff0b86-config-data\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.960405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541f048f-4db6-45d6-aaa2-659dc9ff0b86-combined-ca-bundle\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.965349 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-combined-ca-bundle\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.966125 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-config-data\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: 
\"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.966361 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-config-data-custom\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.970167 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfkz6\" (UniqueName: \"kubernetes.io/projected/541f048f-4db6-45d6-aaa2-659dc9ff0b86-kube-api-access-bfkz6\") pod \"barbican-keystone-listener-7cdd7877d-rvfhb\" (UID: \"541f048f-4db6-45d6-aaa2-659dc9ff0b86\") " pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.970678 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wccfb\" (UniqueName: \"kubernetes.io/projected/f0d2cc2a-cdf2-490c-a56b-48977a5d83e0-kube-api-access-wccfb\") pod \"barbican-worker-59bb77879f-bdcc9\" (UID: \"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0\") " pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:58 crc kubenswrapper[4922]: I0929 10:01:58.977016 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xqhhx"] Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.046676 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.047053 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.047135 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-config\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.047210 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jc2t\" (UniqueName: \"kubernetes.io/projected/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-kube-api-access-9jc2t\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.047325 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.047400 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 
10:01:59.070948 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.071026 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.089965 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-59bb77879f-bdcc9" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.092672 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6df9f988f4-prgw8"] Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.095200 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.097880 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.122782 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6df9f988f4-prgw8"] Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.148336 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.148444 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-logs\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.148507 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.148530 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.148556 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-config\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.149068 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jc2t\" (UniqueName: \"kubernetes.io/projected/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-kube-api-access-9jc2t\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " 
pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.149213 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data-custom\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.149373 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.149563 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-config\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.149760 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.149816 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" 
Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.149817 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.149887 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-combined-ca-bundle\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.150211 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4n4g\" (UniqueName: \"kubernetes.io/projected/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-kube-api-access-h4n4g\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.150821 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.151081 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc 
kubenswrapper[4922]: I0929 10:01:59.151329 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.172480 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jc2t\" (UniqueName: \"kubernetes.io/projected/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-kube-api-access-9jc2t\") pod \"dnsmasq-dns-75c8ddd69c-xqhhx\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.251874 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4n4g\" (UniqueName: \"kubernetes.io/projected/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-kube-api-access-h4n4g\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.252210 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-logs\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.252354 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data-custom\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.252458 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.252557 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-combined-ca-bundle\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.252777 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-logs\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.257971 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data-custom\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.257973 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-combined-ca-bundle\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.259541 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.273411 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4n4g\" (UniqueName: \"kubernetes.io/projected/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-kube-api-access-h4n4g\") pod \"barbican-api-6df9f988f4-prgw8\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.404006 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.431802 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.634801 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.643710 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7lxww" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.662775 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-scripts\") pod \"0c2d9bba-864b-468d-923e-23cf0544daf9\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.662966 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-config-data\") pod \"0c2d9bba-864b-468d-923e-23cf0544daf9\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663032 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c2d9bba-864b-468d-923e-23cf0544daf9-etc-machine-id\") pod \"0c2d9bba-864b-468d-923e-23cf0544daf9\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663099 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-combined-ca-bundle\") pod \"0c2d9bba-864b-468d-923e-23cf0544daf9\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663166 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-config\") pod \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663190 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-swift-storage-0\") pod \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663245 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-sb\") pod \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663292 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-nb\") pod \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663321 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf8qn\" (UniqueName: \"kubernetes.io/projected/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-kube-api-access-cf8qn\") pod \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663453 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-db-sync-config-data\") pod \"0c2d9bba-864b-468d-923e-23cf0544daf9\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663530 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trx5g\" (UniqueName: \"kubernetes.io/projected/0c2d9bba-864b-468d-923e-23cf0544daf9-kube-api-access-trx5g\") pod \"0c2d9bba-864b-468d-923e-23cf0544daf9\" (UID: \"0c2d9bba-864b-468d-923e-23cf0544daf9\") " Sep 29 
10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663608 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-svc\") pod \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\" (UID: \"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63\") " Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.663788 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c2d9bba-864b-468d-923e-23cf0544daf9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0c2d9bba-864b-468d-923e-23cf0544daf9" (UID: "0c2d9bba-864b-468d-923e-23cf0544daf9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.665486 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c2d9bba-864b-468d-923e-23cf0544daf9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.675080 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-kube-api-access-cf8qn" (OuterVolumeSpecName: "kube-api-access-cf8qn") pod "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" (UID: "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63"). InnerVolumeSpecName "kube-api-access-cf8qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.720142 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2d9bba-864b-468d-923e-23cf0544daf9-kube-api-access-trx5g" (OuterVolumeSpecName: "kube-api-access-trx5g") pod "0c2d9bba-864b-468d-923e-23cf0544daf9" (UID: "0c2d9bba-864b-468d-923e-23cf0544daf9"). InnerVolumeSpecName "kube-api-access-trx5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.721043 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-scripts" (OuterVolumeSpecName: "scripts") pod "0c2d9bba-864b-468d-923e-23cf0544daf9" (UID: "0c2d9bba-864b-468d-923e-23cf0544daf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.731446 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0c2d9bba-864b-468d-923e-23cf0544daf9" (UID: "0c2d9bba-864b-468d-923e-23cf0544daf9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.769164 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.769197 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf8qn\" (UniqueName: \"kubernetes.io/projected/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-kube-api-access-cf8qn\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.769209 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:01:59 crc kubenswrapper[4922]: I0929 10:01:59.769218 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trx5g\" (UniqueName: \"kubernetes.io/projected/0c2d9bba-864b-468d-923e-23cf0544daf9-kube-api-access-trx5g\") on node \"crc\" 
DevicePath \"\"" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.005701 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c2d9bba-864b-468d-923e-23cf0544daf9" (UID: "0c2d9bba-864b-468d-923e-23cf0544daf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.064487 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" (UID: "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.066618 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-config" (OuterVolumeSpecName: "config") pod "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" (UID: "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.081484 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.082124 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.082140 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.088284 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" (UID: "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.098047 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-config-data" (OuterVolumeSpecName: "config-data") pod "0c2d9bba-864b-468d-923e-23cf0544daf9" (UID: "0c2d9bba-864b-468d-923e-23cf0544daf9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.099259 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" (UID: "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.108350 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" (UID: "ad38ae38-c12f-4a2c-8cf1-662c1cbecb63"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.184643 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2d9bba-864b-468d-923e-23cf0544daf9-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.184718 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.184732 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.184744 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63-ovsdbserver-nb\") on node 
\"crc\" DevicePath \"\"" Sep 29 10:02:00 crc kubenswrapper[4922]: E0929 10:02:00.376883 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.390343 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" event={"ID":"ad38ae38-c12f-4a2c-8cf1-662c1cbecb63","Type":"ContainerDied","Data":"af56c2596d1c7f0d8a16d8fe5e4ca87d96084ddfa32d2e7ec8568f5a2d83515c"} Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.390625 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.390647 4922 scope.go:117] "RemoveContainer" containerID="d0a71cd1c28edd8cecf5b041c4bac087b227f47dfd5c2566f32df38691862579" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.396776 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1089c4d-63a4-4d54-892c-d4c08291d4ec","Type":"ContainerStarted","Data":"0020e00b680134b2edbe861505c866e985af47510c6ccc8cc6530f28849518ac"} Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.397177 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="ceilometer-notification-agent" containerID="cri-o://2161c77ca8997d8ca98f140447a3ddb270a71d4c1bbab0e07342af8a572264ad" gracePeriod=30 Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.397588 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.397701 4922 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/ceilometer-0" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="proxy-httpd" containerID="cri-o://0020e00b680134b2edbe861505c866e985af47510c6ccc8cc6530f28849518ac" gracePeriod=30 Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.397784 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="sg-core" containerID="cri-o://0fdf40ecda14fd6173dc71928769af8499e874d3aae60187838d27020b1575b4" gracePeriod=30 Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.427808 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7lxww" event={"ID":"0c2d9bba-864b-468d-923e-23cf0544daf9","Type":"ContainerDied","Data":"e2993ab9bd9464d336d99965fd23d36f29dfe04f8e7ff013f52028dff31fa57b"} Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.428051 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2993ab9bd9464d336d99965fd23d36f29dfe04f8e7ff013f52028dff31fa57b" Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.428564 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7lxww"
Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.469906 4922 scope.go:117] "RemoveContainer" containerID="ef4ca87457a9d1479e96a3f3df770f049dd1ba313ce9da1156241e355f90d3b0"
Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.511951 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-lrmb8"]
Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.530178 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-lrmb8"]
Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.556999 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xqhhx"]
Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.631570 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-59bb77879f-bdcc9"]
Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.646008 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6df9f988f4-prgw8"]
Sep 29 10:02:00 crc kubenswrapper[4922]: W0929 10:02:00.775711 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod541f048f_4db6_45d6_aaa2_659dc9ff0b86.slice/crio-005db3ebdf2aa81bf3d7bb8e6c34a7f8c878ea34a4a2337006019625d58499f0 WatchSource:0}: Error finding container 005db3ebdf2aa81bf3d7bb8e6c34a7f8c878ea34a4a2337006019625d58499f0: Status 404 returned error can't find the container with id 005db3ebdf2aa81bf3d7bb8e6c34a7f8c878ea34a4a2337006019625d58499f0
Sep 29 10:02:00 crc kubenswrapper[4922]: I0929 10:02:00.776729 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cdd7877d-rvfhb"]
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.050996 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 29 10:02:01 crc kubenswrapper[4922]: E0929 10:02:01.052365 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" containerName="dnsmasq-dns"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.052390 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" containerName="dnsmasq-dns"
Sep 29 10:02:01 crc kubenswrapper[4922]: E0929 10:02:01.052420 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" containerName="init"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.052426 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" containerName="init"
Sep 29 10:02:01 crc kubenswrapper[4922]: E0929 10:02:01.052450 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2d9bba-864b-468d-923e-23cf0544daf9" containerName="cinder-db-sync"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.052457 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2d9bba-864b-468d-923e-23cf0544daf9" containerName="cinder-db-sync"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.052810 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2d9bba-864b-468d-923e-23cf0544daf9" containerName="cinder-db-sync"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.052864 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" containerName="dnsmasq-dns"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.056411 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.071457 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-sj2gm"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.072091 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.072430 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.072529 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.108822 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.114721 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.114784 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-scripts\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.114869 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97731a71-8a5a-489d-91db-e8b0bcb18f45-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.114892 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.114920 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.114970 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqfcn\" (UniqueName: \"kubernetes.io/projected/97731a71-8a5a-489d-91db-e8b0bcb18f45-kube-api-access-jqfcn\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.155340 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xqhhx"]
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.163015 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-4k9lh"]
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.165743 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.208730 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-4k9lh"]
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.218998 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97731a71-8a5a-489d-91db-e8b0bcb18f45-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219061 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219100 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219143 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-config\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219193 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqfcn\" (UniqueName: \"kubernetes.io/projected/97731a71-8a5a-489d-91db-e8b0bcb18f45-kube-api-access-jqfcn\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219212 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-svc\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219241 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219266 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsmm6\" (UniqueName: \"kubernetes.io/projected/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-kube-api-access-rsmm6\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219292 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219353 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-scripts\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219378 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219447 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.219593 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97731a71-8a5a-489d-91db-e8b0bcb18f45-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.239199 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.241989 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.245589 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqfcn\" (UniqueName: \"kubernetes.io/projected/97731a71-8a5a-489d-91db-e8b0bcb18f45-kube-api-access-jqfcn\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.249545 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.254703 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-scripts\") pod \"cinder-scheduler-0\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.258445 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.260616 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.265097 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.274719 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.321811 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndlmh\" (UniqueName: \"kubernetes.io/projected/757d0a3d-0977-4d21-b355-285ae41f1375-kube-api-access-ndlmh\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.322244 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-config\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.322451 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.322523 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.322683 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-svc\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.322786 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data-custom\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.322917 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.323014 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsmm6\" (UniqueName: \"kubernetes.io/projected/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-kube-api-access-rsmm6\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.323098 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757d0a3d-0977-4d21-b355-285ae41f1375-logs\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.323176 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-scripts\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.323249 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.323324 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.323414 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-config\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.323439 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/757d0a3d-0977-4d21-b355-285ae41f1375-etc-machine-id\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.324521 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.324888 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-svc\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.325307 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.327318 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.349634 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsmm6\" (UniqueName: \"kubernetes.io/projected/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-kube-api-access-rsmm6\") pod \"dnsmasq-dns-5784cf869f-4k9lh\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.428707 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.431784 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data-custom\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.432032 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757d0a3d-0977-4d21-b355-285ae41f1375-logs\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.432129 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-scripts\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.432312 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/757d0a3d-0977-4d21-b355-285ae41f1375-etc-machine-id\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.432435 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndlmh\" (UniqueName: \"kubernetes.io/projected/757d0a3d-0977-4d21-b355-285ae41f1375-kube-api-access-ndlmh\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.432597 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.432696 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.435013 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/757d0a3d-0977-4d21-b355-285ae41f1375-etc-machine-id\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.436395 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757d0a3d-0977-4d21-b355-285ae41f1375-logs\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.437148 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data-custom\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.442201 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.444255 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-scripts\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.445931 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.449565 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" event={"ID":"541f048f-4db6-45d6-aaa2-659dc9ff0b86","Type":"ContainerStarted","Data":"005db3ebdf2aa81bf3d7bb8e6c34a7f8c878ea34a4a2337006019625d58499f0"}
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.461550 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndlmh\" (UniqueName: \"kubernetes.io/projected/757d0a3d-0977-4d21-b355-285ae41f1375-kube-api-access-ndlmh\") pod \"cinder-api-0\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.465712 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" path="/var/lib/kubelet/pods/ad38ae38-c12f-4a2c-8cf1-662c1cbecb63/volumes"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.466786 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6df9f988f4-prgw8"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.466823 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6df9f988f4-prgw8"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.466975 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59bb77879f-bdcc9" event={"ID":"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0","Type":"ContainerStarted","Data":"6fb9bb93ade9265b6df8a0714fab0740e742f808a7dcae79a9bcf76342998788"}
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.467002 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df9f988f4-prgw8" event={"ID":"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22","Type":"ContainerStarted","Data":"2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b"}
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.467012 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df9f988f4-prgw8" event={"ID":"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22","Type":"ContainerStarted","Data":"5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43"}
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.467023 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df9f988f4-prgw8" event={"ID":"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22","Type":"ContainerStarted","Data":"0053879a327993f750ccae378afd886e94d262bf3ad45809edf8f272250dbeac"}
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.478495 4922 generic.go:334] "Generic (PLEG): container finished" podID="512fb117-f1c8-40f7-b6d0-e6d381f76fdc" containerID="fabe9839d1d2949965117180db8d3b9f8c77a73ad4c90c1758dc71230588d0a6" exitCode=0
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.478590 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" event={"ID":"512fb117-f1c8-40f7-b6d0-e6d381f76fdc","Type":"ContainerDied","Data":"fabe9839d1d2949965117180db8d3b9f8c77a73ad4c90c1758dc71230588d0a6"}
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.478630 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" event={"ID":"512fb117-f1c8-40f7-b6d0-e6d381f76fdc","Type":"ContainerStarted","Data":"3e79aa5fe5eb8bbee76fe0be517f900c6e1dab1086d618b4a046395d39e6f23a"}
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.486095 4922 generic.go:334] "Generic (PLEG): container finished" podID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerID="0fdf40ecda14fd6173dc71928769af8499e874d3aae60187838d27020b1575b4" exitCode=2
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.486156 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1089c4d-63a4-4d54-892c-d4c08291d4ec","Type":"ContainerDied","Data":"0fdf40ecda14fd6173dc71928769af8499e874d3aae60187838d27020b1575b4"}
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.496727 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.497443 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6df9f988f4-prgw8" podStartSLOduration=2.497418039 podStartE2EDuration="2.497418039s" podCreationTimestamp="2025-09-29 10:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:02:01.495113987 +0000 UTC m=+1046.861344261" watchObservedRunningTime="2025-09-29 10:02:01.497418039 +0000 UTC m=+1046.863648303"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.680020 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.763025 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-fc765957b-xd4sr" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused"
Sep 29 10:02:01 crc kubenswrapper[4922]: I0929 10:02:01.875865 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-58b957f588-sp2bt" podUID="84f21d67-d595-4458-871c-e4bbc362b134" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused"
Sep 29 10:02:02 crc kubenswrapper[4922]: I0929 10:02:02.032031 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 29 10:02:02 crc kubenswrapper[4922]: E0929 10:02:02.037026 4922 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Sep 29 10:02:02 crc kubenswrapper[4922]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/512fb117-f1c8-40f7-b6d0-e6d381f76fdc/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Sep 29 10:02:02 crc kubenswrapper[4922]: > podSandboxID="3e79aa5fe5eb8bbee76fe0be517f900c6e1dab1086d618b4a046395d39e6f23a"
Sep 29 10:02:02 crc kubenswrapper[4922]: E0929 10:02:02.037352 4922 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Sep 29 10:02:02 crc kubenswrapper[4922]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n566h86hd4h5f9hc8h599h5h56bh75h554h597h5f4hb7h98h58fh66ch57ch668h5bfhd8h596h68dh54h8ch674h587h5bdhb9hc4h695h5b8hccq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9jc2t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-75c8ddd69c-xqhhx_openstack(512fb117-f1c8-40f7-b6d0-e6d381f76fdc): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/512fb117-f1c8-40f7-b6d0-e6d381f76fdc/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Sep 29 10:02:02 crc kubenswrapper[4922]: > logger="UnhandledError"
Sep 29 10:02:02 crc kubenswrapper[4922]: E0929 10:02:02.039187 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/512fb117-f1c8-40f7-b6d0-e6d381f76fdc/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" podUID="512fb117-f1c8-40f7-b6d0-e6d381f76fdc"
Sep 29 10:02:02 crc kubenswrapper[4922]: I0929 10:02:02.307927 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-4k9lh"]
Sep 29 10:02:02 crc kubenswrapper[4922]: I0929 10:02:02.496850 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97731a71-8a5a-489d-91db-e8b0bcb18f45","Type":"ContainerStarted","Data":"b82896e1ec6260ed8a2502130f3da26f82a13c328777d2e53576020c4583b8e9"}
Sep 29 10:02:02 crc kubenswrapper[4922]: I0929 10:02:02.504117 4922 generic.go:334] "Generic (PLEG): container finished" podID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerID="2161c77ca8997d8ca98f140447a3ddb270a71d4c1bbab0e07342af8a572264ad" exitCode=0
Sep 29 10:02:02 crc kubenswrapper[4922]: I0929 10:02:02.504320 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1089c4d-63a4-4d54-892c-d4c08291d4ec","Type":"ContainerDied","Data":"2161c77ca8997d8ca98f140447a3ddb270a71d4c1bbab0e07342af8a572264ad"}
Sep 29 10:02:02 crc kubenswrapper[4922]: I0929 10:02:02.530293 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 29 10:02:02 crc kubenswrapper[4922]: I0929 10:02:02.936154 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx"
Sep 29 10:02:02 crc kubenswrapper[4922]: I0929 10:02:02.992749 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-lrmb8" podUID="ad38ae38-c12f-4a2c-8cf1-662c1cbecb63" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: i/o timeout"
Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.077018 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-config\") pod \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") "
Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.077339 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-swift-storage-0\") pod
\"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.077397 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-nb\") pod \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.077449 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-svc\") pod \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.077500 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jc2t\" (UniqueName: \"kubernetes.io/projected/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-kube-api-access-9jc2t\") pod \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.077563 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-sb\") pod \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\" (UID: \"512fb117-f1c8-40f7-b6d0-e6d381f76fdc\") " Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.090083 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-kube-api-access-9jc2t" (OuterVolumeSpecName: "kube-api-access-9jc2t") pod "512fb117-f1c8-40f7-b6d0-e6d381f76fdc" (UID: "512fb117-f1c8-40f7-b6d0-e6d381f76fdc"). InnerVolumeSpecName "kube-api-access-9jc2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.145135 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "512fb117-f1c8-40f7-b6d0-e6d381f76fdc" (UID: "512fb117-f1c8-40f7-b6d0-e6d381f76fdc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.146345 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "512fb117-f1c8-40f7-b6d0-e6d381f76fdc" (UID: "512fb117-f1c8-40f7-b6d0-e6d381f76fdc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.149800 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-config" (OuterVolumeSpecName: "config") pod "512fb117-f1c8-40f7-b6d0-e6d381f76fdc" (UID: "512fb117-f1c8-40f7-b6d0-e6d381f76fdc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.151667 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "512fb117-f1c8-40f7-b6d0-e6d381f76fdc" (UID: "512fb117-f1c8-40f7-b6d0-e6d381f76fdc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.163482 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "512fb117-f1c8-40f7-b6d0-e6d381f76fdc" (UID: "512fb117-f1c8-40f7-b6d0-e6d381f76fdc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.181487 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.181527 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.181542 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.181551 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.181560 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jc2t\" (UniqueName: \"kubernetes.io/projected/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-kube-api-access-9jc2t\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.181568 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/512fb117-f1c8-40f7-b6d0-e6d381f76fdc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.519555 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" event={"ID":"512fb117-f1c8-40f7-b6d0-e6d381f76fdc","Type":"ContainerDied","Data":"3e79aa5fe5eb8bbee76fe0be517f900c6e1dab1086d618b4a046395d39e6f23a"} Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.519636 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-xqhhx" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.520009 4922 scope.go:117] "RemoveContainer" containerID="fabe9839d1d2949965117180db8d3b9f8c77a73ad4c90c1758dc71230588d0a6" Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.524417 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" event={"ID":"bfa5df04-6c6c-4b1b-868c-47daf84b7da2","Type":"ContainerStarted","Data":"b5be229f38dfb21368713c09261beafbb1c688cafcc23d61523f54f659609ae2"} Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.547880 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"757d0a3d-0977-4d21-b355-285ae41f1375","Type":"ContainerStarted","Data":"aedec90cb5a073cfc81766b52b1cc94644548d4b5a40eb08c4e65bb8b704fc90"} Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.580535 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xqhhx"] Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.591730 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-xqhhx"] Sep 29 10:02:03 crc kubenswrapper[4922]: I0929 10:02:03.848341 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:02:04 crc kubenswrapper[4922]: I0929 10:02:04.570807 4922 generic.go:334] "Generic (PLEG): 
container finished" podID="bfa5df04-6c6c-4b1b-868c-47daf84b7da2" containerID="f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db" exitCode=0 Sep 29 10:02:04 crc kubenswrapper[4922]: I0929 10:02:04.571016 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" event={"ID":"bfa5df04-6c6c-4b1b-868c-47daf84b7da2","Type":"ContainerDied","Data":"f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db"} Sep 29 10:02:04 crc kubenswrapper[4922]: I0929 10:02:04.577164 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" event={"ID":"541f048f-4db6-45d6-aaa2-659dc9ff0b86","Type":"ContainerStarted","Data":"3183f4261a2c07ab3bd683d1b3c4b8e313b110c248a3af51f5e60db05d4ea9b2"} Sep 29 10:02:04 crc kubenswrapper[4922]: I0929 10:02:04.577196 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" event={"ID":"541f048f-4db6-45d6-aaa2-659dc9ff0b86","Type":"ContainerStarted","Data":"30cdf747297f195e62d763031c461d55029c773b4b5751767b841f0dee8ae9b3"} Sep 29 10:02:04 crc kubenswrapper[4922]: I0929 10:02:04.584368 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59bb77879f-bdcc9" event={"ID":"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0","Type":"ContainerStarted","Data":"8aee1a7b4f7d9c12ae436a87618c1e1cb16d68d731d5880129081a741c1a4a16"} Sep 29 10:02:04 crc kubenswrapper[4922]: I0929 10:02:04.584402 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59bb77879f-bdcc9" event={"ID":"f0d2cc2a-cdf2-490c-a56b-48977a5d83e0","Type":"ContainerStarted","Data":"e26790b7623d4790ab32e49090039be0a9f2fcc19092378241355ae9291ea9c1"} Sep 29 10:02:04 crc kubenswrapper[4922]: I0929 10:02:04.604163 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"757d0a3d-0977-4d21-b355-285ae41f1375","Type":"ContainerStarted","Data":"f86ac799058c85d721874cb6071b6305502575e71e0bf8e91516124dda81c1ef"} Sep 29 10:02:04 crc kubenswrapper[4922]: I0929 10:02:04.630198 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-59bb77879f-bdcc9" podStartSLOduration=3.6771496729999997 podStartE2EDuration="6.63013767s" podCreationTimestamp="2025-09-29 10:01:58 +0000 UTC" firstStartedPulling="2025-09-29 10:02:00.64816311 +0000 UTC m=+1046.014393374" lastFinishedPulling="2025-09-29 10:02:03.601151107 +0000 UTC m=+1048.967381371" observedRunningTime="2025-09-29 10:02:04.620582111 +0000 UTC m=+1049.986812375" watchObservedRunningTime="2025-09-29 10:02:04.63013767 +0000 UTC m=+1049.996367934" Sep 29 10:02:04 crc kubenswrapper[4922]: I0929 10:02:04.649647 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7cdd7877d-rvfhb" podStartSLOduration=3.850007562 podStartE2EDuration="6.649616279s" podCreationTimestamp="2025-09-29 10:01:58 +0000 UTC" firstStartedPulling="2025-09-29 10:02:00.779609725 +0000 UTC m=+1046.145839989" lastFinishedPulling="2025-09-29 10:02:03.579218442 +0000 UTC m=+1048.945448706" observedRunningTime="2025-09-29 10:02:04.642125885 +0000 UTC m=+1050.008356149" watchObservedRunningTime="2025-09-29 10:02:04.649616279 +0000 UTC m=+1050.015846543" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.544652 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512fb117-f1c8-40f7-b6d0-e6d381f76fdc" path="/var/lib/kubelet/pods/512fb117-f1c8-40f7-b6d0-e6d381f76fdc/volumes" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.626649 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" containerName="cinder-api-log" 
containerID="cri-o://f86ac799058c85d721874cb6071b6305502575e71e0bf8e91516124dda81c1ef" gracePeriod=30 Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.627052 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"757d0a3d-0977-4d21-b355-285ae41f1375","Type":"ContainerStarted","Data":"235abe55a1217328740b9c8cfc50cff2627d21a4bb2f5fbf534e9b788fa43b8d"} Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.627573 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" containerName="cinder-api" containerID="cri-o://235abe55a1217328740b9c8cfc50cff2627d21a4bb2f5fbf534e9b788fa43b8d" gracePeriod=30 Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.627899 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.643734 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97731a71-8a5a-489d-91db-e8b0bcb18f45","Type":"ContainerStarted","Data":"86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855"} Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.656593 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9c9d67f9d-vs7st"] Sep 29 10:02:05 crc kubenswrapper[4922]: E0929 10:02:05.657013 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512fb117-f1c8-40f7-b6d0-e6d381f76fdc" containerName="init" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.657032 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="512fb117-f1c8-40f7-b6d0-e6d381f76fdc" containerName="init" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.657231 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="512fb117-f1c8-40f7-b6d0-e6d381f76fdc" containerName="init" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.666520 
4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.671772 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.672165 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" event={"ID":"bfa5df04-6c6c-4b1b-868c-47daf84b7da2","Type":"ContainerStarted","Data":"f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742"} Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.672803 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.673351 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.702437 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.702404117 podStartE2EDuration="4.702404117s" podCreationTimestamp="2025-09-29 10:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:02:05.686930098 +0000 UTC m=+1051.053160362" watchObservedRunningTime="2025-09-29 10:02:05.702404117 +0000 UTC m=+1051.068634381" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.747576 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9c9d67f9d-vs7st"] Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.772118 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" podStartSLOduration=4.772081586 podStartE2EDuration="4.772081586s" podCreationTimestamp="2025-09-29 10:02:01 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:02:05.731440674 +0000 UTC m=+1051.097670938" watchObservedRunningTime="2025-09-29 10:02:05.772081586 +0000 UTC m=+1051.138311850" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.849760 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-public-tls-certs\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.849859 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48f8b\" (UniqueName: \"kubernetes.io/projected/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-kube-api-access-48f8b\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.850256 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-logs\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.851032 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-config-data\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.851101 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-combined-ca-bundle\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.851363 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-internal-tls-certs\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.851442 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-config-data-custom\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.953640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-logs\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.953749 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-config-data\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.953778 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-combined-ca-bundle\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.953852 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-internal-tls-certs\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.953885 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-config-data-custom\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.953910 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-public-tls-certs\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.953933 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48f8b\" (UniqueName: \"kubernetes.io/projected/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-kube-api-access-48f8b\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.954307 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-logs\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.981801 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-public-tls-certs\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.982361 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-internal-tls-certs\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.982679 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-config-data-custom\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.983429 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-combined-ca-bundle\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.983633 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-config-data\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.988401 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48f8b\" (UniqueName: \"kubernetes.io/projected/c3007912-e64d-4325-beb8-fa3c2dfcbe5e-kube-api-access-48f8b\") pod \"barbican-api-9c9d67f9d-vs7st\" (UID: \"c3007912-e64d-4325-beb8-fa3c2dfcbe5e\") " pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:05 crc kubenswrapper[4922]: I0929 10:02:05.998574 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:06 crc kubenswrapper[4922]: I0929 10:02:06.539734 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9c9d67f9d-vs7st"] Sep 29 10:02:06 crc kubenswrapper[4922]: W0929 10:02:06.561878 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3007912_e64d_4325_beb8_fa3c2dfcbe5e.slice/crio-b9ccd74b440958bb3b2acca4fc85c8bc1c769bea22a33c3ae4c2e6a0ccf79791 WatchSource:0}: Error finding container b9ccd74b440958bb3b2acca4fc85c8bc1c769bea22a33c3ae4c2e6a0ccf79791: Status 404 returned error can't find the container with id b9ccd74b440958bb3b2acca4fc85c8bc1c769bea22a33c3ae4c2e6a0ccf79791 Sep 29 10:02:06 crc kubenswrapper[4922]: I0929 10:02:06.689491 4922 generic.go:334] "Generic (PLEG): container finished" podID="757d0a3d-0977-4d21-b355-285ae41f1375" containerID="f86ac799058c85d721874cb6071b6305502575e71e0bf8e91516124dda81c1ef" exitCode=143 Sep 29 10:02:06 crc kubenswrapper[4922]: I0929 10:02:06.689589 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"757d0a3d-0977-4d21-b355-285ae41f1375","Type":"ContainerDied","Data":"f86ac799058c85d721874cb6071b6305502575e71e0bf8e91516124dda81c1ef"} Sep 29 10:02:06 crc kubenswrapper[4922]: I0929 10:02:06.692659 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97731a71-8a5a-489d-91db-e8b0bcb18f45","Type":"ContainerStarted","Data":"0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01"} Sep 29 10:02:06 crc kubenswrapper[4922]: I0929 10:02:06.694406 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9c9d67f9d-vs7st" event={"ID":"c3007912-e64d-4325-beb8-fa3c2dfcbe5e","Type":"ContainerStarted","Data":"b9ccd74b440958bb3b2acca4fc85c8bc1c769bea22a33c3ae4c2e6a0ccf79791"} Sep 29 10:02:06 crc kubenswrapper[4922]: I0929 10:02:06.727694 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.965058572 podStartE2EDuration="6.72767178s" podCreationTimestamp="2025-09-29 10:02:00 +0000 UTC" firstStartedPulling="2025-09-29 10:02:02.087707426 +0000 UTC m=+1047.453937690" lastFinishedPulling="2025-09-29 10:02:03.850320634 +0000 UTC m=+1049.216550898" observedRunningTime="2025-09-29 10:02:06.716751833 +0000 UTC m=+1052.082982097" watchObservedRunningTime="2025-09-29 10:02:06.72767178 +0000 UTC m=+1052.093902034" Sep 29 10:02:07 crc kubenswrapper[4922]: I0929 10:02:07.715081 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9c9d67f9d-vs7st" event={"ID":"c3007912-e64d-4325-beb8-fa3c2dfcbe5e","Type":"ContainerStarted","Data":"444957b6db38ecae9c7bd7eb23e1e96aaf8eb673136e6b92f27b6cd0732c4f42"} Sep 29 10:02:07 crc kubenswrapper[4922]: I0929 10:02:07.715706 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9c9d67f9d-vs7st" 
event={"ID":"c3007912-e64d-4325-beb8-fa3c2dfcbe5e","Type":"ContainerStarted","Data":"74d62ea7a5d41776d4e9252efc7332b107a1f89f71cb02313bd9d35434e8c7c6"} Sep 29 10:02:07 crc kubenswrapper[4922]: I0929 10:02:07.715734 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:07 crc kubenswrapper[4922]: I0929 10:02:07.715751 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:07 crc kubenswrapper[4922]: I0929 10:02:07.753312 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9c9d67f9d-vs7st" podStartSLOduration=2.753280271 podStartE2EDuration="2.753280271s" podCreationTimestamp="2025-09-29 10:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:02:07.746490968 +0000 UTC m=+1053.112721242" watchObservedRunningTime="2025-09-29 10:02:07.753280271 +0000 UTC m=+1053.119510535" Sep 29 10:02:10 crc kubenswrapper[4922]: I0929 10:02:10.393659 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:02:10 crc kubenswrapper[4922]: I0929 10:02:10.790744 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:02:10 crc kubenswrapper[4922]: I0929 10:02:10.990460 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:02:11 crc kubenswrapper[4922]: I0929 10:02:11.429174 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 29 10:02:11 crc kubenswrapper[4922]: I0929 10:02:11.500059 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" Sep 29 10:02:11 crc 
kubenswrapper[4922]: I0929 10:02:11.582620 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-8ntc5"] Sep 29 10:02:11 crc kubenswrapper[4922]: I0929 10:02:11.582903 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" podUID="08defa73-766c-460e-97b9-ca7a1194230d" containerName="dnsmasq-dns" containerID="cri-o://6082321864e7c7057b85d91354c4b75d44321f80d7bcbd2172b58e7eb397997f" gracePeriod=10 Sep 29 10:02:11 crc kubenswrapper[4922]: I0929 10:02:11.795404 4922 generic.go:334] "Generic (PLEG): container finished" podID="08defa73-766c-460e-97b9-ca7a1194230d" containerID="6082321864e7c7057b85d91354c4b75d44321f80d7bcbd2172b58e7eb397997f" exitCode=0 Sep 29 10:02:11 crc kubenswrapper[4922]: I0929 10:02:11.795482 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" event={"ID":"08defa73-766c-460e-97b9-ca7a1194230d","Type":"ContainerDied","Data":"6082321864e7c7057b85d91354c4b75d44321f80d7bcbd2172b58e7eb397997f"} Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.028119 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.164482 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.282659 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.443070 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-svc\") pod \"08defa73-766c-460e-97b9-ca7a1194230d\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.443247 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-config\") pod \"08defa73-766c-460e-97b9-ca7a1194230d\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.443292 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-nb\") pod \"08defa73-766c-460e-97b9-ca7a1194230d\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.443321 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-sb\") pod \"08defa73-766c-460e-97b9-ca7a1194230d\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.443350 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh4jq\" (UniqueName: \"kubernetes.io/projected/08defa73-766c-460e-97b9-ca7a1194230d-kube-api-access-hh4jq\") pod \"08defa73-766c-460e-97b9-ca7a1194230d\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.443423 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-swift-storage-0\") pod \"08defa73-766c-460e-97b9-ca7a1194230d\" (UID: \"08defa73-766c-460e-97b9-ca7a1194230d\") " Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.461300 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08defa73-766c-460e-97b9-ca7a1194230d-kube-api-access-hh4jq" (OuterVolumeSpecName: "kube-api-access-hh4jq") pod "08defa73-766c-460e-97b9-ca7a1194230d" (UID: "08defa73-766c-460e-97b9-ca7a1194230d"). InnerVolumeSpecName "kube-api-access-hh4jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.535925 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "08defa73-766c-460e-97b9-ca7a1194230d" (UID: "08defa73-766c-460e-97b9-ca7a1194230d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.553289 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-config" (OuterVolumeSpecName: "config") pod "08defa73-766c-460e-97b9-ca7a1194230d" (UID: "08defa73-766c-460e-97b9-ca7a1194230d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.566557 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08defa73-766c-460e-97b9-ca7a1194230d" (UID: "08defa73-766c-460e-97b9-ca7a1194230d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.584372 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.584410 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.584429 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh4jq\" (UniqueName: \"kubernetes.io/projected/08defa73-766c-460e-97b9-ca7a1194230d-kube-api-access-hh4jq\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.584440 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.602989 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08defa73-766c-460e-97b9-ca7a1194230d" (UID: "08defa73-766c-460e-97b9-ca7a1194230d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.623465 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08defa73-766c-460e-97b9-ca7a1194230d" (UID: "08defa73-766c-460e-97b9-ca7a1194230d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.687179 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.687216 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08defa73-766c-460e-97b9-ca7a1194230d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.807868 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" event={"ID":"08defa73-766c-460e-97b9-ca7a1194230d","Type":"ContainerDied","Data":"e511a577f9169706cf94835042017b497d7f1d331618a985d2fbbc1fae27830e"} Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.808177 4922 scope.go:117] "RemoveContainer" containerID="6082321864e7c7057b85d91354c4b75d44321f80d7bcbd2172b58e7eb397997f" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.808064 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerName="cinder-scheduler" containerID="cri-o://86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855" gracePeriod=30 Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.807954 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-8ntc5" Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.808172 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerName="probe" containerID="cri-o://0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01" gracePeriod=30 Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.849450 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-8ntc5"] Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.861876 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-8ntc5"] Sep 29 10:02:12 crc kubenswrapper[4922]: I0929 10:02:12.872107 4922 scope.go:117] "RemoveContainer" containerID="ec693b0837d4065fa01a22507007b0483a3de82c7b5d8b40e0121758ecf79e5c" Sep 29 10:02:13 crc kubenswrapper[4922]: I0929 10:02:13.058935 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8b5fcf5f9-p74mm" Sep 29 10:02:13 crc kubenswrapper[4922]: I0929 10:02:13.226468 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56694df76d-5npfb"] Sep 29 10:02:13 crc kubenswrapper[4922]: I0929 10:02:13.227635 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56694df76d-5npfb" podUID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerName="neutron-httpd" containerID="cri-o://a1fbbafc63e8d3c19bafc6b82450832e0052d711c0d1d09ca40286e1866e0175" gracePeriod=30 Sep 29 10:02:13 crc kubenswrapper[4922]: I0929 10:02:13.227260 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56694df76d-5npfb" podUID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerName="neutron-api" containerID="cri-o://fe31755261ee819f5b1ed9789272133534028a97c2064da8e6ed0c950ba4e354" gracePeriod=30 Sep 29 10:02:13 crc 
kubenswrapper[4922]: I0929 10:02:13.258939 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:13 crc kubenswrapper[4922]: I0929 10:02:13.464600 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08defa73-766c-460e-97b9-ca7a1194230d" path="/var/lib/kubelet/pods/08defa73-766c-460e-97b9-ca7a1194230d/volumes" Sep 29 10:02:13 crc kubenswrapper[4922]: I0929 10:02:13.825992 4922 generic.go:334] "Generic (PLEG): container finished" podID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerID="a1fbbafc63e8d3c19bafc6b82450832e0052d711c0d1d09ca40286e1866e0175" exitCode=0 Sep 29 10:02:13 crc kubenswrapper[4922]: I0929 10:02:13.826082 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56694df76d-5npfb" event={"ID":"9c70da77-2c81-490c-a16f-91680ebea9b5","Type":"ContainerDied","Data":"a1fbbafc63e8d3c19bafc6b82450832e0052d711c0d1d09ca40286e1866e0175"} Sep 29 10:02:14 crc kubenswrapper[4922]: I0929 10:02:14.306805 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:02:14 crc kubenswrapper[4922]: I0929 10:02:14.401017 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:02:14 crc kubenswrapper[4922]: I0929 10:02:14.850363 4922 generic.go:334] "Generic (PLEG): container finished" podID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerID="0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01" exitCode=0 Sep 29 10:02:14 crc kubenswrapper[4922]: I0929 10:02:14.850460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97731a71-8a5a-489d-91db-e8b0bcb18f45","Type":"ContainerDied","Data":"0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01"} Sep 29 10:02:14 crc kubenswrapper[4922]: I0929 10:02:14.924031 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 29 10:02:15 crc kubenswrapper[4922]: I0929 10:02:15.164457 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9c9d67f9d-vs7st" Sep 29 10:02:15 crc kubenswrapper[4922]: I0929 10:02:15.230140 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6df9f988f4-prgw8"] Sep 29 10:02:15 crc kubenswrapper[4922]: I0929 10:02:15.230522 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6df9f988f4-prgw8" podUID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerName="barbican-api" containerID="cri-o://2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b" gracePeriod=30 Sep 29 10:02:15 crc kubenswrapper[4922]: I0929 10:02:15.230463 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6df9f988f4-prgw8" podUID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerName="barbican-api-log" containerID="cri-o://5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43" gracePeriod=30 Sep 29 10:02:15 crc kubenswrapper[4922]: I0929 10:02:15.884439 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df9f988f4-prgw8" event={"ID":"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22","Type":"ContainerDied","Data":"5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43"} Sep 29 10:02:15 crc kubenswrapper[4922]: I0929 10:02:15.885914 4922 generic.go:334] "Generic (PLEG): container finished" podID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerID="5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43" exitCode=143 Sep 29 10:02:16 crc kubenswrapper[4922]: I0929 10:02:16.371633 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:02:16 crc kubenswrapper[4922]: I0929 10:02:16.672522 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/horizon-58b957f588-sp2bt" Sep 29 10:02:16 crc kubenswrapper[4922]: I0929 10:02:16.747264 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fc765957b-xd4sr"] Sep 29 10:02:16 crc kubenswrapper[4922]: I0929 10:02:16.901596 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-fc765957b-xd4sr" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon" containerID="cri-o://279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66" gracePeriod=30 Sep 29 10:02:16 crc kubenswrapper[4922]: I0929 10:02:16.901562 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-fc765957b-xd4sr" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon-log" containerID="cri-o://0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31" gracePeriod=30 Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.639345 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.807348 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-scripts\") pod \"97731a71-8a5a-489d-91db-e8b0bcb18f45\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.808012 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97731a71-8a5a-489d-91db-e8b0bcb18f45-etc-machine-id\") pod \"97731a71-8a5a-489d-91db-e8b0bcb18f45\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.808052 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97731a71-8a5a-489d-91db-e8b0bcb18f45-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "97731a71-8a5a-489d-91db-e8b0bcb18f45" (UID: "97731a71-8a5a-489d-91db-e8b0bcb18f45"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.808109 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-combined-ca-bundle\") pod \"97731a71-8a5a-489d-91db-e8b0bcb18f45\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.808218 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqfcn\" (UniqueName: \"kubernetes.io/projected/97731a71-8a5a-489d-91db-e8b0bcb18f45-kube-api-access-jqfcn\") pod \"97731a71-8a5a-489d-91db-e8b0bcb18f45\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.808285 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data-custom\") pod \"97731a71-8a5a-489d-91db-e8b0bcb18f45\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.808350 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data\") pod \"97731a71-8a5a-489d-91db-e8b0bcb18f45\" (UID: \"97731a71-8a5a-489d-91db-e8b0bcb18f45\") " Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.808716 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97731a71-8a5a-489d-91db-e8b0bcb18f45-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.815868 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97731a71-8a5a-489d-91db-e8b0bcb18f45-kube-api-access-jqfcn" 
(OuterVolumeSpecName: "kube-api-access-jqfcn") pod "97731a71-8a5a-489d-91db-e8b0bcb18f45" (UID: "97731a71-8a5a-489d-91db-e8b0bcb18f45"). InnerVolumeSpecName "kube-api-access-jqfcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.816242 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-scripts" (OuterVolumeSpecName: "scripts") pod "97731a71-8a5a-489d-91db-e8b0bcb18f45" (UID: "97731a71-8a5a-489d-91db-e8b0bcb18f45"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.817037 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "97731a71-8a5a-489d-91db-e8b0bcb18f45" (UID: "97731a71-8a5a-489d-91db-e8b0bcb18f45"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.875145 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97731a71-8a5a-489d-91db-e8b0bcb18f45" (UID: "97731a71-8a5a-489d-91db-e8b0bcb18f45"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.912590 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.912638 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.912652 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.912667 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqfcn\" (UniqueName: \"kubernetes.io/projected/97731a71-8a5a-489d-91db-e8b0bcb18f45-kube-api-access-jqfcn\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.925657 4922 generic.go:334] "Generic (PLEG): container finished" podID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerID="86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855" exitCode=0 Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.925748 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97731a71-8a5a-489d-91db-e8b0bcb18f45","Type":"ContainerDied","Data":"86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855"} Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.925767 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.925804 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97731a71-8a5a-489d-91db-e8b0bcb18f45","Type":"ContainerDied","Data":"b82896e1ec6260ed8a2502130f3da26f82a13c328777d2e53576020c4583b8e9"} Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.925851 4922 scope.go:117] "RemoveContainer" containerID="0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.932826 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data" (OuterVolumeSpecName: "config-data") pod "97731a71-8a5a-489d-91db-e8b0bcb18f45" (UID: "97731a71-8a5a-489d-91db-e8b0bcb18f45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.965139 4922 scope.go:117] "RemoveContainer" containerID="86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.988893 4922 scope.go:117] "RemoveContainer" containerID="0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01" Sep 29 10:02:17 crc kubenswrapper[4922]: E0929 10:02:17.989509 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01\": container with ID starting with 0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01 not found: ID does not exist" containerID="0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.989551 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01"} err="failed to get container status \"0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01\": rpc error: code = NotFound desc = could not find container \"0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01\": container with ID starting with 0f4f2411a08cc0ea36a516349aaafee964125e234eadc678a1a289817f55fa01 not found: ID does not exist" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.989585 4922 scope.go:117] "RemoveContainer" containerID="86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855" Sep 29 10:02:17 crc kubenswrapper[4922]: E0929 10:02:17.990083 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855\": container with ID starting with 86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855 not found: ID does not exist" containerID="86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855" Sep 29 10:02:17 crc kubenswrapper[4922]: I0929 10:02:17.990115 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855"} err="failed to get container status \"86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855\": rpc error: code = NotFound desc = could not find container \"86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855\": container with ID starting with 86df948f228ff4d41ad1ceb7974bdf9ea6bb5bd3ef4eaf62ee92d24eaccdb855 not found: ID does not exist" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.019716 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97731a71-8a5a-489d-91db-e8b0bcb18f45-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:18 crc kubenswrapper[4922]: 
I0929 10:02:18.312633 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.335266 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.346016 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:02:18 crc kubenswrapper[4922]: E0929 10:02:18.346553 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08defa73-766c-460e-97b9-ca7a1194230d" containerName="init" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.346577 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="08defa73-766c-460e-97b9-ca7a1194230d" containerName="init" Sep 29 10:02:18 crc kubenswrapper[4922]: E0929 10:02:18.346596 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08defa73-766c-460e-97b9-ca7a1194230d" containerName="dnsmasq-dns" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.346604 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="08defa73-766c-460e-97b9-ca7a1194230d" containerName="dnsmasq-dns" Sep 29 10:02:18 crc kubenswrapper[4922]: E0929 10:02:18.346624 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerName="probe" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.346633 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerName="probe" Sep 29 10:02:18 crc kubenswrapper[4922]: E0929 10:02:18.346645 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerName="cinder-scheduler" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.346652 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerName="cinder-scheduler" Sep 29 10:02:18 crc kubenswrapper[4922]: 
I0929 10:02:18.347441 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerName="probe" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.347469 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="08defa73-766c-460e-97b9-ca7a1194230d" containerName="dnsmasq-dns" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.347489 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="97731a71-8a5a-489d-91db-e8b0bcb18f45" containerName="cinder-scheduler" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.348691 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.352900 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.374529 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.535852 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.536982 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.537095 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2hlcz\" (UniqueName: \"kubernetes.io/projected/a9172c63-e9e9-43c9-a804-72410f85eefe-kube-api-access-2hlcz\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.537257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.537611 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.537724 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9172c63-e9e9-43c9-a804-72410f85eefe-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.639640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.639749 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.639781 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hlcz\" (UniqueName: \"kubernetes.io/projected/a9172c63-e9e9-43c9-a804-72410f85eefe-kube-api-access-2hlcz\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.639811 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.639958 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.639993 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9172c63-e9e9-43c9-a804-72410f85eefe-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.640121 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9172c63-e9e9-43c9-a804-72410f85eefe-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " 
pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.646381 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.648452 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.650540 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.653571 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9172c63-e9e9-43c9-a804-72410f85eefe-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.660396 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hlcz\" (UniqueName: \"kubernetes.io/projected/a9172c63-e9e9-43c9-a804-72410f85eefe-kube-api-access-2hlcz\") pod \"cinder-scheduler-0\" (UID: \"a9172c63-e9e9-43c9-a804-72410f85eefe\") " pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.669772 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.917100 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.946864 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data\") pod \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.946938 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data-custom\") pod \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.946972 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4n4g\" (UniqueName: \"kubernetes.io/projected/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-kube-api-access-h4n4g\") pod \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.947002 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-logs\") pod \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\" (UID: \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.947073 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-combined-ca-bundle\") pod \"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\" (UID: 
\"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22\") " Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.950601 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-logs" (OuterVolumeSpecName: "logs") pod "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" (UID: "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.955582 4922 generic.go:334] "Generic (PLEG): container finished" podID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerID="2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b" exitCode=0 Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.955667 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df9f988f4-prgw8" event={"ID":"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22","Type":"ContainerDied","Data":"2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b"} Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.955714 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df9f988f4-prgw8" event={"ID":"fff58aa5-ae6f-4ec5-b98f-1987bf57bd22","Type":"ContainerDied","Data":"0053879a327993f750ccae378afd886e94d262bf3ad45809edf8f272250dbeac"} Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.955741 4922 scope.go:117] "RemoveContainer" containerID="2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.955972 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6df9f988f4-prgw8" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.956248 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" (UID: "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.956601 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-kube-api-access-h4n4g" (OuterVolumeSpecName: "kube-api-access-h4n4g") pod "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" (UID: "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22"). InnerVolumeSpecName "kube-api-access-h4n4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.967728 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56694df76d-5npfb" event={"ID":"9c70da77-2c81-490c-a16f-91680ebea9b5","Type":"ContainerDied","Data":"fe31755261ee819f5b1ed9789272133534028a97c2064da8e6ed0c950ba4e354"} Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.967584 4922 generic.go:334] "Generic (PLEG): container finished" podID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerID="fe31755261ee819f5b1ed9789272133534028a97c2064da8e6ed0c950ba4e354" exitCode=0 Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.980182 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" (UID: "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:18 crc kubenswrapper[4922]: I0929 10:02:18.984660 4922 scope.go:117] "RemoveContainer" containerID="5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.006395 4922 scope.go:117] "RemoveContainer" containerID="2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b" Sep 29 10:02:19 crc kubenswrapper[4922]: E0929 10:02:19.007051 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b\": container with ID starting with 2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b not found: ID does not exist" containerID="2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.007102 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b"} err="failed to get container status \"2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b\": rpc error: code = NotFound desc = could not find container \"2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b\": container with ID starting with 2218f3fec5b20d7332e7445dfdb5aeeb2fc31fbc4aeb724ed2640bfaf2e13c2b not found: ID does not exist" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.007138 4922 scope.go:117] "RemoveContainer" containerID="5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43" Sep 29 10:02:19 crc kubenswrapper[4922]: E0929 10:02:19.007415 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43\": container with ID starting with 
5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43 not found: ID does not exist" containerID="5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.007437 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43"} err="failed to get container status \"5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43\": rpc error: code = NotFound desc = could not find container \"5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43\": container with ID starting with 5171718cb98ead8cd01e11d1fe8061af394495e756045e6b743957e4e1219c43 not found: ID does not exist" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.024787 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data" (OuterVolumeSpecName: "config-data") pod "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" (UID: "fff58aa5-ae6f-4ec5-b98f-1987bf57bd22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.054220 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4n4g\" (UniqueName: \"kubernetes.io/projected/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-kube-api-access-h4n4g\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.054255 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.054271 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.054285 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.054298 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.193970 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.311919 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6df9f988f4-prgw8"] Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.322310 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6df9f988f4-prgw8"] Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.361026 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqq8t\" (UniqueName: \"kubernetes.io/projected/9c70da77-2c81-490c-a16f-91680ebea9b5-kube-api-access-rqq8t\") pod \"9c70da77-2c81-490c-a16f-91680ebea9b5\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.361290 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-ovndb-tls-certs\") pod \"9c70da77-2c81-490c-a16f-91680ebea9b5\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.361432 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-combined-ca-bundle\") pod \"9c70da77-2c81-490c-a16f-91680ebea9b5\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.361507 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-httpd-config\") pod \"9c70da77-2c81-490c-a16f-91680ebea9b5\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.361611 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-config\") pod \"9c70da77-2c81-490c-a16f-91680ebea9b5\" (UID: \"9c70da77-2c81-490c-a16f-91680ebea9b5\") " Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.366269 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c70da77-2c81-490c-a16f-91680ebea9b5-kube-api-access-rqq8t" (OuterVolumeSpecName: "kube-api-access-rqq8t") pod "9c70da77-2c81-490c-a16f-91680ebea9b5" (UID: "9c70da77-2c81-490c-a16f-91680ebea9b5"). InnerVolumeSpecName "kube-api-access-rqq8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.366765 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9c70da77-2c81-490c-a16f-91680ebea9b5" (UID: "9c70da77-2c81-490c-a16f-91680ebea9b5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.432983 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.435988 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-config" (OuterVolumeSpecName: "config") pod "9c70da77-2c81-490c-a16f-91680ebea9b5" (UID: "9c70da77-2c81-490c-a16f-91680ebea9b5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:19 crc kubenswrapper[4922]: W0929 10:02:19.436432 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9172c63_e9e9_43c9_a804_72410f85eefe.slice/crio-030f8b4ef9d67441e565498e72da950121a2b83e6281aad62eb18156501dfcfa WatchSource:0}: Error finding container 030f8b4ef9d67441e565498e72da950121a2b83e6281aad62eb18156501dfcfa: Status 404 returned error can't find the container with id 030f8b4ef9d67441e565498e72da950121a2b83e6281aad62eb18156501dfcfa Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.436745 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c70da77-2c81-490c-a16f-91680ebea9b5" (UID: "9c70da77-2c81-490c-a16f-91680ebea9b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.465853 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.465899 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.465929 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.465942 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqq8t\" (UniqueName: 
\"kubernetes.io/projected/9c70da77-2c81-490c-a16f-91680ebea9b5-kube-api-access-rqq8t\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.469623 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97731a71-8a5a-489d-91db-e8b0bcb18f45" path="/var/lib/kubelet/pods/97731a71-8a5a-489d-91db-e8b0bcb18f45/volumes" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.471277 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" path="/var/lib/kubelet/pods/fff58aa5-ae6f-4ec5-b98f-1987bf57bd22/volumes" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.473110 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9c70da77-2c81-490c-a16f-91680ebea9b5" (UID: "9c70da77-2c81-490c-a16f-91680ebea9b5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.567192 4922 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c70da77-2c81-490c-a16f-91680ebea9b5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.977933 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56694df76d-5npfb" event={"ID":"9c70da77-2c81-490c-a16f-91680ebea9b5","Type":"ContainerDied","Data":"4a38f2e4fe6dd34a94007a39ebf2a5c70c2b317089403cb81feed80b5b0b107b"} Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.977988 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56694df76d-5npfb" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.978377 4922 scope.go:117] "RemoveContainer" containerID="a1fbbafc63e8d3c19bafc6b82450832e0052d711c0d1d09ca40286e1866e0175" Sep 29 10:02:19 crc kubenswrapper[4922]: I0929 10:02:19.983400 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9172c63-e9e9-43c9-a804-72410f85eefe","Type":"ContainerStarted","Data":"030f8b4ef9d67441e565498e72da950121a2b83e6281aad62eb18156501dfcfa"} Sep 29 10:02:20 crc kubenswrapper[4922]: I0929 10:02:20.017392 4922 scope.go:117] "RemoveContainer" containerID="fe31755261ee819f5b1ed9789272133534028a97c2064da8e6ed0c950ba4e354" Sep 29 10:02:20 crc kubenswrapper[4922]: I0929 10:02:20.030109 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56694df76d-5npfb"] Sep 29 10:02:20 crc kubenswrapper[4922]: I0929 10:02:20.038146 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56694df76d-5npfb"] Sep 29 10:02:20 crc kubenswrapper[4922]: I0929 10:02:20.055996 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:02:20 crc kubenswrapper[4922]: I0929 10:02:20.193614 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7fc9c79bdd-vqp6p" Sep 29 10:02:20 crc kubenswrapper[4922]: I0929 10:02:20.885904 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-79f578d789-bbw9r" Sep 29 10:02:21 crc kubenswrapper[4922]: I0929 10:02:21.008746 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9172c63-e9e9-43c9-a804-72410f85eefe","Type":"ContainerStarted","Data":"b0599c2c9afc3ae0508fa987fd7491d6e84fc350818d6598aeeafb8084d0dbc7"} Sep 29 10:02:21 crc kubenswrapper[4922]: I0929 10:02:21.008813 4922 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9172c63-e9e9-43c9-a804-72410f85eefe","Type":"ContainerStarted","Data":"c95f2a0d9ec0ea694890fbec76f8d7fea401d865f55c772f1a1f59dda4966967"} Sep 29 10:02:21 crc kubenswrapper[4922]: I0929 10:02:21.012283 4922 generic.go:334] "Generic (PLEG): container finished" podID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerID="279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66" exitCode=0 Sep 29 10:02:21 crc kubenswrapper[4922]: I0929 10:02:21.012367 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc765957b-xd4sr" event={"ID":"c63e97c2-45d9-4b32-9b0e-1449fad249e6","Type":"ContainerDied","Data":"279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66"} Sep 29 10:02:21 crc kubenswrapper[4922]: I0929 10:02:21.035539 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.035518791 podStartE2EDuration="3.035518791s" podCreationTimestamp="2025-09-29 10:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:02:21.026926268 +0000 UTC m=+1066.393156532" watchObservedRunningTime="2025-09-29 10:02:21.035518791 +0000 UTC m=+1066.401749055" Sep 29 10:02:21 crc kubenswrapper[4922]: I0929 10:02:21.468123 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c70da77-2c81-490c-a16f-91680ebea9b5" path="/var/lib/kubelet/pods/9c70da77-2c81-490c-a16f-91680ebea9b5/volumes" Sep 29 10:02:21 crc kubenswrapper[4922]: I0929 10:02:21.760221 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-fc765957b-xd4sr" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Sep 29 10:02:23 crc 
kubenswrapper[4922]: I0929 10:02:23.670973 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.432127 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5f74f76895-9f28s"] Sep 29 10:02:24 crc kubenswrapper[4922]: E0929 10:02:24.433157 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerName="barbican-api" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.433185 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerName="barbican-api" Sep 29 10:02:24 crc kubenswrapper[4922]: E0929 10:02:24.433209 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerName="barbican-api-log" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.433218 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerName="barbican-api-log" Sep 29 10:02:24 crc kubenswrapper[4922]: E0929 10:02:24.433237 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerName="neutron-api" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.433249 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerName="neutron-api" Sep 29 10:02:24 crc kubenswrapper[4922]: E0929 10:02:24.433271 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerName="neutron-httpd" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.433279 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerName="neutron-httpd" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.433505 4922 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerName="neutron-api" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.433534 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerName="barbican-api" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.433546 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c70da77-2c81-490c-a16f-91680ebea9b5" containerName="neutron-httpd" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.433565 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff58aa5-ae6f-4ec5-b98f-1987bf57bd22" containerName="barbican-api-log" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.434756 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5f74f76895-9f28s" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.437163 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.437517 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.445319 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.454751 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5f74f76895-9f28s"] Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.497358 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b044ac1-a144-454a-a2f7-bf438ba13cc0-log-httpd\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s" Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 
10:02:24.497435 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-internal-tls-certs\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.497626 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-combined-ca-bundle\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.497821 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9z4h\" (UniqueName: \"kubernetes.io/projected/1b044ac1-a144-454a-a2f7-bf438ba13cc0-kube-api-access-v9z4h\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.497935 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-config-data\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.498006 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b044ac1-a144-454a-a2f7-bf438ba13cc0-run-httpd\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.498092 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-public-tls-certs\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.498414 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1b044ac1-a144-454a-a2f7-bf438ba13cc0-etc-swift\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.600579 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b044ac1-a144-454a-a2f7-bf438ba13cc0-log-httpd\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.600652 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-internal-tls-certs\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.600699 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-combined-ca-bundle\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.600741 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9z4h\" (UniqueName: \"kubernetes.io/projected/1b044ac1-a144-454a-a2f7-bf438ba13cc0-kube-api-access-v9z4h\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.600767 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-config-data\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.600795 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b044ac1-a144-454a-a2f7-bf438ba13cc0-run-httpd\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.600818 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-public-tls-certs\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.600892 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1b044ac1-a144-454a-a2f7-bf438ba13cc0-etc-swift\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.602185 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b044ac1-a144-454a-a2f7-bf438ba13cc0-run-httpd\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.602327 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b044ac1-a144-454a-a2f7-bf438ba13cc0-log-httpd\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.610734 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-combined-ca-bundle\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.611216 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-internal-tls-certs\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.620427 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-config-data\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.623165 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1b044ac1-a144-454a-a2f7-bf438ba13cc0-etc-swift\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.636418 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b044ac1-a144-454a-a2f7-bf438ba13cc0-public-tls-certs\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.652345 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9z4h\" (UniqueName: \"kubernetes.io/projected/1b044ac1-a144-454a-a2f7-bf438ba13cc0-kube-api-access-v9z4h\") pod \"swift-proxy-5f74f76895-9f28s\" (UID: \"1b044ac1-a144-454a-a2f7-bf438ba13cc0\") " pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:24 crc kubenswrapper[4922]: I0929 10:02:24.772183 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.304140 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.318899 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.325390 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.326412 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-z695t"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.326632 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.326849 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.425588 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5f74f76895-9f28s"]
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.426372 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnngx\" (UniqueName: \"kubernetes.io/projected/64a14668-6282-4924-8311-10ca1411aeb2-kube-api-access-tnngx\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.426467 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.426490 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config-secret\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.426528 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.528561 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.528743 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnngx\" (UniqueName: \"kubernetes.io/projected/64a14668-6282-4924-8311-10ca1411aeb2-kube-api-access-tnngx\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.528858 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.528889 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config-secret\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.530353 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.533605 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config-secret\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.535315 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.547425 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnngx\" (UniqueName: \"kubernetes.io/projected/64a14668-6282-4924-8311-10ca1411aeb2-kube-api-access-tnngx\") pod \"openstackclient\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.644071 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.729734 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.734712 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.745961 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.747480 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.757249 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.834330 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6091d68-5a19-44af-8ffb-ec05b516a160-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.834407 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdst\" (UniqueName: \"kubernetes.io/projected/c6091d68-5a19-44af-8ffb-ec05b516a160-kube-api-access-hqdst\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.834455 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6091d68-5a19-44af-8ffb-ec05b516a160-openstack-config\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.834484 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6091d68-5a19-44af-8ffb-ec05b516a160-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.936760 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6091d68-5a19-44af-8ffb-ec05b516a160-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.936868 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdst\" (UniqueName: \"kubernetes.io/projected/c6091d68-5a19-44af-8ffb-ec05b516a160-kube-api-access-hqdst\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.936945 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6091d68-5a19-44af-8ffb-ec05b516a160-openstack-config\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.936982 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6091d68-5a19-44af-8ffb-ec05b516a160-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.937923 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6091d68-5a19-44af-8ffb-ec05b516a160-openstack-config\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.942482 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6091d68-5a19-44af-8ffb-ec05b516a160-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.942520 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6091d68-5a19-44af-8ffb-ec05b516a160-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: I0929 10:02:25.961508 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdst\" (UniqueName: \"kubernetes.io/projected/c6091d68-5a19-44af-8ffb-ec05b516a160-kube-api-access-hqdst\") pod \"openstackclient\" (UID: \"c6091d68-5a19-44af-8ffb-ec05b516a160\") " pod="openstack/openstackclient"
Sep 29 10:02:25 crc kubenswrapper[4922]: E0929 10:02:25.963381 4922 log.go:32] "RunPodSandbox from runtime service failed" err=<
Sep 29 10:02:25 crc kubenswrapper[4922]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_64a14668-6282-4924-8311-10ca1411aeb2_0(7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe" Netns:"/var/run/netns/0fed3e99-a96d-465d-9604-ba6fe092297f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe;K8S_POD_UID=64a14668-6282-4924-8311-10ca1411aeb2" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/64a14668-6282-4924-8311-10ca1411aeb2:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe network default NAD default] [openstack/openstackclient 7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe network default NAD default] pod deleted before sandbox ADD operation began
Sep 29 10:02:25 crc kubenswrapper[4922]: '
Sep 29 10:02:25 crc kubenswrapper[4922]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Sep 29 10:02:25 crc kubenswrapper[4922]: >
Sep 29 10:02:25 crc kubenswrapper[4922]: E0929 10:02:25.963437 4922 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Sep 29 10:02:25 crc kubenswrapper[4922]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_64a14668-6282-4924-8311-10ca1411aeb2_0(7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe" Netns:"/var/run/netns/0fed3e99-a96d-465d-9604-ba6fe092297f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe;K8S_POD_UID=64a14668-6282-4924-8311-10ca1411aeb2" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/64a14668-6282-4924-8311-10ca1411aeb2:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe network default NAD default] [openstack/openstackclient 7e6cb71f02620cb78974dc2f7ca70563da5490764cb1cb3133ef85eda2997abe network default NAD default] pod deleted before sandbox ADD operation began
Sep 29 10:02:25 crc kubenswrapper[4922]: '
Sep 29 10:02:25 crc kubenswrapper[4922]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Sep 29 10:02:25 crc kubenswrapper[4922]: > pod="openstack/openstackclient"
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.084257 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.084529 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f74f76895-9f28s" event={"ID":"1b044ac1-a144-454a-a2f7-bf438ba13cc0","Type":"ContainerStarted","Data":"6cf87d6109cd33b48dd9923ee29df0ec6170ca727b4e0be3d0074befa26ffe2b"}
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.084605 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f74f76895-9f28s" event={"ID":"1b044ac1-a144-454a-a2f7-bf438ba13cc0","Type":"ContainerStarted","Data":"a9e7fd6e5811fe30813a2a413e3c77995d8aee8ec2e65aac99b5af2019d3bc45"}
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.084623 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f74f76895-9f28s" event={"ID":"1b044ac1-a144-454a-a2f7-bf438ba13cc0","Type":"ContainerStarted","Data":"43fe967e874bfafcbe3f4e9a348cada665e4b2c81c3d44f9af9335cb65652ab5"}
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.098068 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.123262 4922 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="64a14668-6282-4924-8311-10ca1411aeb2" podUID="c6091d68-5a19-44af-8ffb-ec05b516a160"
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.130444 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5f74f76895-9f28s" podStartSLOduration=2.130395011 podStartE2EDuration="2.130395011s" podCreationTimestamp="2025-09-29 10:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:02:26.11562487 +0000 UTC m=+1071.481855154" watchObservedRunningTime="2025-09-29 10:02:26.130395011 +0000 UTC m=+1071.496625275"
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.185347 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.245676 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnngx\" (UniqueName: \"kubernetes.io/projected/64a14668-6282-4924-8311-10ca1411aeb2-kube-api-access-tnngx\") pod \"64a14668-6282-4924-8311-10ca1411aeb2\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") "
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.246470 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-combined-ca-bundle\") pod \"64a14668-6282-4924-8311-10ca1411aeb2\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") "
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.246512 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config-secret\") pod \"64a14668-6282-4924-8311-10ca1411aeb2\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") "
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.246537 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config\") pod \"64a14668-6282-4924-8311-10ca1411aeb2\" (UID: \"64a14668-6282-4924-8311-10ca1411aeb2\") "
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.248072 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "64a14668-6282-4924-8311-10ca1411aeb2" (UID: "64a14668-6282-4924-8311-10ca1411aeb2"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.252276 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "64a14668-6282-4924-8311-10ca1411aeb2" (UID: "64a14668-6282-4924-8311-10ca1411aeb2"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.252300 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64a14668-6282-4924-8311-10ca1411aeb2" (UID: "64a14668-6282-4924-8311-10ca1411aeb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.254666 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a14668-6282-4924-8311-10ca1411aeb2-kube-api-access-tnngx" (OuterVolumeSpecName: "kube-api-access-tnngx") pod "64a14668-6282-4924-8311-10ca1411aeb2" (UID: "64a14668-6282-4924-8311-10ca1411aeb2"). InnerVolumeSpecName "kube-api-access-tnngx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.349677 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnngx\" (UniqueName: \"kubernetes.io/projected/64a14668-6282-4924-8311-10ca1411aeb2-kube-api-access-tnngx\") on node \"crc\" DevicePath \"\""
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.349735 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.349751 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.349765 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64a14668-6282-4924-8311-10ca1411aeb2-openstack-config\") on node \"crc\" DevicePath \"\""
Sep 29 10:02:26 crc kubenswrapper[4922]: I0929 10:02:26.701714 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Sep 29 10:02:27 crc kubenswrapper[4922]: I0929 10:02:27.096131 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c6091d68-5a19-44af-8ffb-ec05b516a160","Type":"ContainerStarted","Data":"167e394d261d428c961187774a93b90162da647cd15bd06e7a834b1802961c2d"}
Sep 29 10:02:27 crc kubenswrapper[4922]: I0929 10:02:27.096145 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 29 10:02:27 crc kubenswrapper[4922]: I0929 10:02:27.096536 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:27 crc kubenswrapper[4922]: I0929 10:02:27.096571 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:27 crc kubenswrapper[4922]: I0929 10:02:27.115184 4922 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="64a14668-6282-4924-8311-10ca1411aeb2" podUID="c6091d68-5a19-44af-8ffb-ec05b516a160"
Sep 29 10:02:27 crc kubenswrapper[4922]: I0929 10:02:27.468958 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a14668-6282-4924-8311-10ca1411aeb2" path="/var/lib/kubelet/pods/64a14668-6282-4924-8311-10ca1411aeb2/volumes"
Sep 29 10:02:28 crc kubenswrapper[4922]: I0929 10:02:28.948746 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Sep 29 10:02:29 crc kubenswrapper[4922]: I0929 10:02:29.071081 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:02:29 crc kubenswrapper[4922]: I0929 10:02:29.071168 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:02:31 crc kubenswrapper[4922]: I0929 10:02:31.139130 4922 generic.go:334] "Generic (PLEG): container finished" podID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerID="0020e00b680134b2edbe861505c866e985af47510c6ccc8cc6530f28849518ac" exitCode=137
Sep 29 10:02:31 crc kubenswrapper[4922]: I0929 10:02:31.139701 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1089c4d-63a4-4d54-892c-d4c08291d4ec","Type":"ContainerDied","Data":"0020e00b680134b2edbe861505c866e985af47510c6ccc8cc6530f28849518ac"}
Sep 29 10:02:31 crc kubenswrapper[4922]: I0929 10:02:31.761075 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-fc765957b-xd4sr" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused"
Sep 29 10:02:33 crc kubenswrapper[4922]: I0929 10:02:33.299030 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.149:3000/\": dial tcp 10.217.0.149:3000: connect: connection refused"
Sep 29 10:02:34 crc kubenswrapper[4922]: I0929 10:02:34.780458 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:34 crc kubenswrapper[4922]: I0929 10:02:34.791273 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5f74f76895-9f28s"
Sep 29 10:02:36 crc kubenswrapper[4922]: I0929 10:02:36.201282 4922 generic.go:334] "Generic (PLEG): container finished" podID="757d0a3d-0977-4d21-b355-285ae41f1375" containerID="235abe55a1217328740b9c8cfc50cff2627d21a4bb2f5fbf534e9b788fa43b8d" exitCode=137
Sep 29 10:02:36 crc kubenswrapper[4922]: I0929 10:02:36.201347 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"757d0a3d-0977-4d21-b355-285ae41f1375","Type":"ContainerDied","Data":"235abe55a1217328740b9c8cfc50cff2627d21a4bb2f5fbf534e9b788fa43b8d"}
Sep 29 10:02:36 crc kubenswrapper[4922]: I0929 10:02:36.681935 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.168:8776/healthcheck\": dial tcp 10.217.0.168:8776: connect: connection refused"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.107786 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.203536 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8wns\" (UniqueName: \"kubernetes.io/projected/b1089c4d-63a4-4d54-892c-d4c08291d4ec-kube-api-access-v8wns\") pod \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") "
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.203640 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-sg-core-conf-yaml\") pod \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") "
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.203700 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-scripts\") pod \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") "
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.203920 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-log-httpd\") pod \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") "
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.203968 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-config-data\") pod \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") "
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.204066 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-combined-ca-bundle\") pod \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") "
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.204121 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-run-httpd\") pod \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\" (UID: \"b1089c4d-63a4-4d54-892c-d4c08291d4ec\") "
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.205540 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b1089c4d-63a4-4d54-892c-d4c08291d4ec" (UID: "b1089c4d-63a4-4d54-892c-d4c08291d4ec"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.210742 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b1089c4d-63a4-4d54-892c-d4c08291d4ec" (UID: "b1089c4d-63a4-4d54-892c-d4c08291d4ec"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.213287 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1089c4d-63a4-4d54-892c-d4c08291d4ec-kube-api-access-v8wns" (OuterVolumeSpecName: "kube-api-access-v8wns") pod "b1089c4d-63a4-4d54-892c-d4c08291d4ec" (UID: "b1089c4d-63a4-4d54-892c-d4c08291d4ec"). InnerVolumeSpecName "kube-api-access-v8wns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.217885 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-scripts" (OuterVolumeSpecName: "scripts") pod "b1089c4d-63a4-4d54-892c-d4c08291d4ec" (UID: "b1089c4d-63a4-4d54-892c-d4c08291d4ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.222548 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.231420 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"757d0a3d-0977-4d21-b355-285ae41f1375","Type":"ContainerDied","Data":"aedec90cb5a073cfc81766b52b1cc94644548d4b5a40eb08c4e65bb8b704fc90"}
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.231485 4922 scope.go:117] "RemoveContainer" containerID="235abe55a1217328740b9c8cfc50cff2627d21a4bb2f5fbf534e9b788fa43b8d"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.231644 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.245444 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c6091d68-5a19-44af-8ffb-ec05b516a160","Type":"ContainerStarted","Data":"0e3d1aeb3c011ba7fe5324538b7526642f35aaad84af51739ca2e370e9049499"} Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.265357 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1089c4d-63a4-4d54-892c-d4c08291d4ec","Type":"ContainerDied","Data":"a4b3d9b268616bfaa41d1f154d60e82a648bd66021aacc32d99c3b23685a0b04"} Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.265530 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.288144 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b1089c4d-63a4-4d54-892c-d4c08291d4ec" (UID: "b1089c4d-63a4-4d54-892c-d4c08291d4ec"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.290956 4922 scope.go:117] "RemoveContainer" containerID="f86ac799058c85d721874cb6071b6305502575e71e0bf8e91516124dda81c1ef" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.306013 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-combined-ca-bundle\") pod \"757d0a3d-0977-4d21-b355-285ae41f1375\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.306056 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-scripts\") pod \"757d0a3d-0977-4d21-b355-285ae41f1375\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.306154 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data\") pod \"757d0a3d-0977-4d21-b355-285ae41f1375\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.306210 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757d0a3d-0977-4d21-b355-285ae41f1375-logs\") pod \"757d0a3d-0977-4d21-b355-285ae41f1375\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.306298 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/757d0a3d-0977-4d21-b355-285ae41f1375-etc-machine-id\") pod \"757d0a3d-0977-4d21-b355-285ae41f1375\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " Sep 29 10:02:37 crc 
kubenswrapper[4922]: I0929 10:02:37.306518 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data-custom\") pod \"757d0a3d-0977-4d21-b355-285ae41f1375\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.306686 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndlmh\" (UniqueName: \"kubernetes.io/projected/757d0a3d-0977-4d21-b355-285ae41f1375-kube-api-access-ndlmh\") pod \"757d0a3d-0977-4d21-b355-285ae41f1375\" (UID: \"757d0a3d-0977-4d21-b355-285ae41f1375\") " Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.307307 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.307327 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1089c4d-63a4-4d54-892c-d4c08291d4ec-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.307337 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8wns\" (UniqueName: \"kubernetes.io/projected/b1089c4d-63a4-4d54-892c-d4c08291d4ec-kube-api-access-v8wns\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.307350 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.307360 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-scripts\") on 
node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.308980 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.178457402 podStartE2EDuration="12.308965814s" podCreationTimestamp="2025-09-29 10:02:25 +0000 UTC" firstStartedPulling="2025-09-29 10:02:26.705847636 +0000 UTC m=+1072.072077900" lastFinishedPulling="2025-09-29 10:02:36.836356038 +0000 UTC m=+1082.202586312" observedRunningTime="2025-09-29 10:02:37.291579543 +0000 UTC m=+1082.657809807" watchObservedRunningTime="2025-09-29 10:02:37.308965814 +0000 UTC m=+1082.675196068" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.310286 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1089c4d-63a4-4d54-892c-d4c08291d4ec" (UID: "b1089c4d-63a4-4d54-892c-d4c08291d4ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.311342 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757d0a3d-0977-4d21-b355-285ae41f1375-logs" (OuterVolumeSpecName: "logs") pod "757d0a3d-0977-4d21-b355-285ae41f1375" (UID: "757d0a3d-0977-4d21-b355-285ae41f1375"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.311404 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/757d0a3d-0977-4d21-b355-285ae41f1375-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "757d0a3d-0977-4d21-b355-285ae41f1375" (UID: "757d0a3d-0977-4d21-b355-285ae41f1375"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.320208 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757d0a3d-0977-4d21-b355-285ae41f1375-kube-api-access-ndlmh" (OuterVolumeSpecName: "kube-api-access-ndlmh") pod "757d0a3d-0977-4d21-b355-285ae41f1375" (UID: "757d0a3d-0977-4d21-b355-285ae41f1375"). InnerVolumeSpecName "kube-api-access-ndlmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.326001 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "757d0a3d-0977-4d21-b355-285ae41f1375" (UID: "757d0a3d-0977-4d21-b355-285ae41f1375"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.330090 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-scripts" (OuterVolumeSpecName: "scripts") pod "757d0a3d-0977-4d21-b355-285ae41f1375" (UID: "757d0a3d-0977-4d21-b355-285ae41f1375"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.344160 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "757d0a3d-0977-4d21-b355-285ae41f1375" (UID: "757d0a3d-0977-4d21-b355-285ae41f1375"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.345078 4922 scope.go:117] "RemoveContainer" containerID="0020e00b680134b2edbe861505c866e985af47510c6ccc8cc6530f28849518ac" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.358442 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-config-data" (OuterVolumeSpecName: "config-data") pod "b1089c4d-63a4-4d54-892c-d4c08291d4ec" (UID: "b1089c4d-63a4-4d54-892c-d4c08291d4ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.388315 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data" (OuterVolumeSpecName: "config-data") pod "757d0a3d-0977-4d21-b355-285ae41f1375" (UID: "757d0a3d-0977-4d21-b355-285ae41f1375"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.390723 4922 scope.go:117] "RemoveContainer" containerID="0fdf40ecda14fd6173dc71928769af8499e874d3aae60187838d27020b1575b4" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.412735 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.412800 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.412810 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndlmh\" (UniqueName: \"kubernetes.io/projected/757d0a3d-0977-4d21-b355-285ae41f1375-kube-api-access-ndlmh\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.412823 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.412851 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.412867 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1089c4d-63a4-4d54-892c-d4c08291d4ec-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.412876 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/757d0a3d-0977-4d21-b355-285ae41f1375-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.412887 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757d0a3d-0977-4d21-b355-285ae41f1375-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.412916 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/757d0a3d-0977-4d21-b355-285ae41f1375-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.414317 4922 scope.go:117] "RemoveContainer" containerID="2161c77ca8997d8ca98f140447a3ddb270a71d4c1bbab0e07342af8a572264ad" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.558864 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.572660 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.583180 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:02:37 crc kubenswrapper[4922]: E0929 10:02:37.583749 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="ceilometer-notification-agent" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.583779 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="ceilometer-notification-agent" Sep 29 10:02:37 crc kubenswrapper[4922]: E0929 10:02:37.583812 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="proxy-httpd" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.583822 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="proxy-httpd" Sep 29 10:02:37 crc kubenswrapper[4922]: E0929 10:02:37.583885 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" containerName="cinder-api" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.583896 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" containerName="cinder-api" Sep 29 10:02:37 crc kubenswrapper[4922]: E0929 10:02:37.583914 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" containerName="cinder-api-log" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.583923 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" containerName="cinder-api-log" Sep 29 10:02:37 crc kubenswrapper[4922]: E0929 10:02:37.583935 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="sg-core" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.583941 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="sg-core" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.584167 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="ceilometer-notification-agent" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.584195 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" containerName="cinder-api-log" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.584205 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="proxy-httpd" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.584226 4922 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" containerName="sg-core" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.584238 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" containerName="cinder-api" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.585652 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.589004 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.589472 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.589541 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.601907 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.632142 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.669890 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.687892 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.690128 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.693943 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.694048 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.719487 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34ed6be9-1694-4866-a437-36f08027b85f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.719569 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-scripts\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.719626 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ed6be9-1694-4866-a437-36f08027b85f-logs\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.719729 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-config-data-custom\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.719784 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.719803 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.719820 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.719960 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9chbd\" (UniqueName: \"kubernetes.io/projected/34ed6be9-1694-4866-a437-36f08027b85f-kube-api-access-9chbd\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.719989 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-config-data\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.728293 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.822189 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-scripts\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.822257 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ed6be9-1694-4866-a437-36f08027b85f-logs\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.822305 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-log-httpd\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.822375 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-config-data-custom\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.822402 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.822428 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.822521 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.822730 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.838860 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ed6be9-1694-4866-a437-36f08027b85f-logs\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.838938 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4grj\" (UniqueName: \"kubernetes.io/projected/8a92c760-c810-46be-aeea-f295975b8451-kube-api-access-x4grj\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.839187 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0" Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.839299 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-scripts\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.839441 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9chbd\" (UniqueName: \"kubernetes.io/projected/34ed6be9-1694-4866-a437-36f08027b85f-kube-api-access-9chbd\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.839563 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-config-data\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.839623 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-run-httpd\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.839729 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-config-data\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.839773 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34ed6be9-1694-4866-a437-36f08027b85f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.840120 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34ed6be9-1694-4866-a437-36f08027b85f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.843919 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.844983 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.847249 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-config-data-custom\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.852209 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-config-data\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.871069 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-scripts\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.885957 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ed6be9-1694-4866-a437-36f08027b85f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.887906 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9chbd\" (UniqueName: \"kubernetes.io/projected/34ed6be9-1694-4866-a437-36f08027b85f-kube-api-access-9chbd\") pod \"cinder-api-0\" (UID: \"34ed6be9-1694-4866-a437-36f08027b85f\") " pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.908242 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.945443 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-log-httpd\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.945551 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.945596 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4grj\" (UniqueName: \"kubernetes.io/projected/8a92c760-c810-46be-aeea-f295975b8451-kube-api-access-x4grj\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.945638 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.945676 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-scripts\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.945736 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-run-httpd\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.945773 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-config-data\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.946944 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-log-httpd\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.950170 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-run-httpd\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.953629 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-config-data\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.958126 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-scripts\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.977594 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.977722 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.981549 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-dhmn8"]
Sep 29 10:02:37 crc kubenswrapper[4922]: I0929 10:02:37.983843 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dhmn8"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.024796 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dhmn8"]
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.026723 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4grj\" (UniqueName: \"kubernetes.io/projected/8a92c760-c810-46be-aeea-f295975b8451-kube-api-access-x4grj\") pod \"ceilometer-0\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " pod="openstack/ceilometer-0"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.051141 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwsp4\" (UniqueName: \"kubernetes.io/projected/3536ac4e-7447-4439-aa09-ef3fce28f84a-kube-api-access-cwsp4\") pod \"nova-api-db-create-dhmn8\" (UID: \"3536ac4e-7447-4439-aa09-ef3fce28f84a\") " pod="openstack/nova-api-db-create-dhmn8"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.058653 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-bdl79"]
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.060883 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bdl79"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.100267 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bdl79"]
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.117387 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.157681 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwsp4\" (UniqueName: \"kubernetes.io/projected/3536ac4e-7447-4439-aa09-ef3fce28f84a-kube-api-access-cwsp4\") pod \"nova-api-db-create-dhmn8\" (UID: \"3536ac4e-7447-4439-aa09-ef3fce28f84a\") " pod="openstack/nova-api-db-create-dhmn8"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.158388 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvstw\" (UniqueName: \"kubernetes.io/projected/179180d9-2c92-4e80-bf4c-560bfe6e3a69-kube-api-access-gvstw\") pod \"nova-cell0-db-create-bdl79\" (UID: \"179180d9-2c92-4e80-bf4c-560bfe6e3a69\") " pod="openstack/nova-cell0-db-create-bdl79"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.179093 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwsp4\" (UniqueName: \"kubernetes.io/projected/3536ac4e-7447-4439-aa09-ef3fce28f84a-kube-api-access-cwsp4\") pod \"nova-api-db-create-dhmn8\" (UID: \"3536ac4e-7447-4439-aa09-ef3fce28f84a\") " pod="openstack/nova-api-db-create-dhmn8"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.250059 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dc5px"]
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.252985 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dc5px"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.260640 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dc5px"]
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.261480 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvstw\" (UniqueName: \"kubernetes.io/projected/179180d9-2c92-4e80-bf4c-560bfe6e3a69-kube-api-access-gvstw\") pod \"nova-cell0-db-create-bdl79\" (UID: \"179180d9-2c92-4e80-bf4c-560bfe6e3a69\") " pod="openstack/nova-cell0-db-create-bdl79"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.289813 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvstw\" (UniqueName: \"kubernetes.io/projected/179180d9-2c92-4e80-bf4c-560bfe6e3a69-kube-api-access-gvstw\") pod \"nova-cell0-db-create-bdl79\" (UID: \"179180d9-2c92-4e80-bf4c-560bfe6e3a69\") " pod="openstack/nova-cell0-db-create-bdl79"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.366490 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkfrm\" (UniqueName: \"kubernetes.io/projected/bac65ee5-a195-4375-a997-c0f5cfea448e-kube-api-access-pkfrm\") pod \"nova-cell1-db-create-dc5px\" (UID: \"bac65ee5-a195-4375-a997-c0f5cfea448e\") " pod="openstack/nova-cell1-db-create-dc5px"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.388108 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dhmn8"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.421375 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bdl79"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.469458 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkfrm\" (UniqueName: \"kubernetes.io/projected/bac65ee5-a195-4375-a997-c0f5cfea448e-kube-api-access-pkfrm\") pod \"nova-cell1-db-create-dc5px\" (UID: \"bac65ee5-a195-4375-a997-c0f5cfea448e\") " pod="openstack/nova-cell1-db-create-dc5px"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.497067 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkfrm\" (UniqueName: \"kubernetes.io/projected/bac65ee5-a195-4375-a997-c0f5cfea448e-kube-api-access-pkfrm\") pod \"nova-cell1-db-create-dc5px\" (UID: \"bac65ee5-a195-4375-a997-c0f5cfea448e\") " pod="openstack/nova-cell1-db-create-dc5px"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.617336 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dc5px"
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.628864 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.736703 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:02:38 crc kubenswrapper[4922]: I0929 10:02:38.888416 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bdl79"]
Sep 29 10:02:39 crc kubenswrapper[4922]: I0929 10:02:39.028516 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dhmn8"]
Sep 29 10:02:39 crc kubenswrapper[4922]: I0929 10:02:39.345621 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dhmn8" event={"ID":"3536ac4e-7447-4439-aa09-ef3fce28f84a","Type":"ContainerStarted","Data":"eba09d2e9656f2264228c0c4ac06f5db57a494da0d667956471379f19e281f27"}
Sep 29 10:02:39 crc kubenswrapper[4922]: I0929 10:02:39.350093 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerStarted","Data":"fc0c3d455c295f7368b789e3b0f2ec0159f239ac7fa90c8cc8a697a2e00cfc16"}
Sep 29 10:02:39 crc kubenswrapper[4922]: I0929 10:02:39.353295 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"34ed6be9-1694-4866-a437-36f08027b85f","Type":"ContainerStarted","Data":"4d7ec98cfcc1eeb82e3d54a1c020079d1350b5c2c1a2060d370ab33dc18e7482"}
Sep 29 10:02:39 crc kubenswrapper[4922]: I0929 10:02:39.361628 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bdl79" event={"ID":"179180d9-2c92-4e80-bf4c-560bfe6e3a69","Type":"ContainerStarted","Data":"ffd8d520a5b98f7b72a061912ff7ffff9a7674078fd1e5e4f095a5e5fc15977d"}
Sep 29 10:02:39 crc kubenswrapper[4922]: I0929 10:02:39.378491 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dc5px"]
Sep 29 10:02:39 crc kubenswrapper[4922]: I0929 10:02:39.488544 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757d0a3d-0977-4d21-b355-285ae41f1375" path="/var/lib/kubelet/pods/757d0a3d-0977-4d21-b355-285ae41f1375/volumes"
Sep 29 10:02:39 crc kubenswrapper[4922]: I0929 10:02:39.490255 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1089c4d-63a4-4d54-892c-d4c08291d4ec" path="/var/lib/kubelet/pods/b1089c4d-63a4-4d54-892c-d4c08291d4ec/volumes"
Sep 29 10:02:39 crc kubenswrapper[4922]: I0929 10:02:39.572553 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 10:02:40 crc kubenswrapper[4922]: I0929 10:02:40.405392 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerStarted","Data":"82b2e0f247a05bbf54288d98d2b691ef17b255235458acf0eb0b3a7124c6e741"}
Sep 29 10:02:40 crc kubenswrapper[4922]: I0929 10:02:40.416103 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"34ed6be9-1694-4866-a437-36f08027b85f","Type":"ContainerStarted","Data":"6341762a74fced93c44f8eeb809ebb0ead2b21de4162f1e82e521a81473e8502"}
Sep 29 10:02:40 crc kubenswrapper[4922]: I0929 10:02:40.430109 4922 generic.go:334] "Generic (PLEG): container finished" podID="bac65ee5-a195-4375-a997-c0f5cfea448e" containerID="2ad9df1339618630e3f336a05c1b9feb2487875c208c5ac27898a98a97112b63" exitCode=0
Sep 29 10:02:40 crc kubenswrapper[4922]: I0929 10:02:40.430248 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dc5px" event={"ID":"bac65ee5-a195-4375-a997-c0f5cfea448e","Type":"ContainerDied","Data":"2ad9df1339618630e3f336a05c1b9feb2487875c208c5ac27898a98a97112b63"}
Sep 29 10:02:40 crc kubenswrapper[4922]: I0929 10:02:40.430725 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dc5px" event={"ID":"bac65ee5-a195-4375-a997-c0f5cfea448e","Type":"ContainerStarted","Data":"bcbcfbce22e0c6bfeaffa09e14ca3c0f50bfad218d475aa32f09b92ff698fa81"}
Sep 29 10:02:40 crc kubenswrapper[4922]: I0929 10:02:40.435054 4922 generic.go:334] "Generic (PLEG): container finished" podID="179180d9-2c92-4e80-bf4c-560bfe6e3a69" containerID="d7282c01419a74f27d0b9accaf14900951a579f6d8a96ffec4afce1fb373401d" exitCode=0
Sep 29 10:02:40 crc kubenswrapper[4922]: I0929 10:02:40.435134 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bdl79" event={"ID":"179180d9-2c92-4e80-bf4c-560bfe6e3a69","Type":"ContainerDied","Data":"d7282c01419a74f27d0b9accaf14900951a579f6d8a96ffec4afce1fb373401d"}
Sep 29 10:02:40 crc kubenswrapper[4922]: I0929 10:02:40.439937 4922 generic.go:334] "Generic (PLEG): container finished" podID="3536ac4e-7447-4439-aa09-ef3fce28f84a" containerID="5bfae74081334f822c7239f7a7142602cfb9475747994858e5fafbd23645e2c4" exitCode=0
Sep 29 10:02:40 crc kubenswrapper[4922]: I0929 10:02:40.440051 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dhmn8" event={"ID":"3536ac4e-7447-4439-aa09-ef3fce28f84a","Type":"ContainerDied","Data":"5bfae74081334f822c7239f7a7142602cfb9475747994858e5fafbd23645e2c4"}
Sep 29 10:02:41 crc kubenswrapper[4922]: I0929 10:02:41.485656 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"34ed6be9-1694-4866-a437-36f08027b85f","Type":"ContainerStarted","Data":"b46b325a2afe70615883bda3217ae350248ad7ec5cc92cf5b432e8a41dc32126"}
Sep 29 10:02:41 crc kubenswrapper[4922]: I0929 10:02:41.486144 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Sep 29 10:02:41 crc kubenswrapper[4922]: I0929 10:02:41.486162 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerStarted","Data":"6a7ec4aabb608d8cbb7fa5f0312589bbd3a985d244e1088fd55169af4f2fd950"}
Sep 29 10:02:41 crc kubenswrapper[4922]: I0929 10:02:41.490093 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.490061646 podStartE2EDuration="4.490061646s" podCreationTimestamp="2025-09-29 10:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:02:41.476743384 +0000 UTC m=+1086.842973648" watchObservedRunningTime="2025-09-29 10:02:41.490061646 +0000 UTC m=+1086.856291910"
Sep 29 10:02:41 crc kubenswrapper[4922]: I0929 10:02:41.761977 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-fc765957b-xd4sr" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused"
Sep 29 10:02:41 crc kubenswrapper[4922]: I0929 10:02:41.762566 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-fc765957b-xd4sr"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.028119 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dhmn8"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.092546 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwsp4\" (UniqueName: \"kubernetes.io/projected/3536ac4e-7447-4439-aa09-ef3fce28f84a-kube-api-access-cwsp4\") pod \"3536ac4e-7447-4439-aa09-ef3fce28f84a\" (UID: \"3536ac4e-7447-4439-aa09-ef3fce28f84a\") "
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.105030 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3536ac4e-7447-4439-aa09-ef3fce28f84a-kube-api-access-cwsp4" (OuterVolumeSpecName: "kube-api-access-cwsp4") pod "3536ac4e-7447-4439-aa09-ef3fce28f84a" (UID: "3536ac4e-7447-4439-aa09-ef3fce28f84a"). InnerVolumeSpecName "kube-api-access-cwsp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.177595 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dc5px"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.183567 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bdl79"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.195086 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkfrm\" (UniqueName: \"kubernetes.io/projected/bac65ee5-a195-4375-a997-c0f5cfea448e-kube-api-access-pkfrm\") pod \"bac65ee5-a195-4375-a997-c0f5cfea448e\" (UID: \"bac65ee5-a195-4375-a997-c0f5cfea448e\") "
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.195226 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvstw\" (UniqueName: \"kubernetes.io/projected/179180d9-2c92-4e80-bf4c-560bfe6e3a69-kube-api-access-gvstw\") pod \"179180d9-2c92-4e80-bf4c-560bfe6e3a69\" (UID: \"179180d9-2c92-4e80-bf4c-560bfe6e3a69\") "
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.196087 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwsp4\" (UniqueName: \"kubernetes.io/projected/3536ac4e-7447-4439-aa09-ef3fce28f84a-kube-api-access-cwsp4\") on node \"crc\" DevicePath \"\""
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.205758 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bac65ee5-a195-4375-a997-c0f5cfea448e-kube-api-access-pkfrm" (OuterVolumeSpecName: "kube-api-access-pkfrm") pod "bac65ee5-a195-4375-a997-c0f5cfea448e" (UID: "bac65ee5-a195-4375-a997-c0f5cfea448e"). InnerVolumeSpecName "kube-api-access-pkfrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.220500 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179180d9-2c92-4e80-bf4c-560bfe6e3a69-kube-api-access-gvstw" (OuterVolumeSpecName: "kube-api-access-gvstw") pod "179180d9-2c92-4e80-bf4c-560bfe6e3a69" (UID: "179180d9-2c92-4e80-bf4c-560bfe6e3a69"). InnerVolumeSpecName "kube-api-access-gvstw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.298845 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkfrm\" (UniqueName: \"kubernetes.io/projected/bac65ee5-a195-4375-a997-c0f5cfea448e-kube-api-access-pkfrm\") on node \"crc\" DevicePath \"\""
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.298890 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvstw\" (UniqueName: \"kubernetes.io/projected/179180d9-2c92-4e80-bf4c-560bfe6e3a69-kube-api-access-gvstw\") on node \"crc\" DevicePath \"\""
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.486177 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bdl79" event={"ID":"179180d9-2c92-4e80-bf4c-560bfe6e3a69","Type":"ContainerDied","Data":"ffd8d520a5b98f7b72a061912ff7ffff9a7674078fd1e5e4f095a5e5fc15977d"}
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.486248 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffd8d520a5b98f7b72a061912ff7ffff9a7674078fd1e5e4f095a5e5fc15977d"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.486203 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bdl79"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.488937 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dhmn8"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.488959 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dhmn8" event={"ID":"3536ac4e-7447-4439-aa09-ef3fce28f84a","Type":"ContainerDied","Data":"eba09d2e9656f2264228c0c4ac06f5db57a494da0d667956471379f19e281f27"}
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.489021 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eba09d2e9656f2264228c0c4ac06f5db57a494da0d667956471379f19e281f27"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.492496 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerStarted","Data":"3ade7ce3dbc51dd2e09b5b238b6d312edf54f4f76439b3e0bbec3cac3dc1f7d3"}
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.495077 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dc5px"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.495069 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dc5px" event={"ID":"bac65ee5-a195-4375-a997-c0f5cfea448e","Type":"ContainerDied","Data":"bcbcfbce22e0c6bfeaffa09e14ca3c0f50bfad218d475aa32f09b92ff698fa81"}
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.495269 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcbcfbce22e0c6bfeaffa09e14ca3c0f50bfad218d475aa32f09b92ff698fa81"
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.507049 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.507325 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerName="glance-log" containerID="cri-o://c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694" gracePeriod=30
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.508053 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerName="glance-httpd" containerID="cri-o://92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0" gracePeriod=30
Sep 29 10:02:42 crc kubenswrapper[4922]: I0929 10:02:42.519407 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.157:9292/healthcheck\": EOF"
Sep 29 10:02:43 crc kubenswrapper[4922]: I0929 10:02:43.507751 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerStarted","Data":"8429ce5a84aa82bf600d193267f99157d69f4a6f313b870783bd40192ae4eb21"}
Sep 29 10:02:43 crc kubenswrapper[4922]: I0929 10:02:43.508609 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 29 10:02:43 crc kubenswrapper[4922]: I0929 10:02:43.508186 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="ceilometer-central-agent" containerID="cri-o://82b2e0f247a05bbf54288d98d2b691ef17b255235458acf0eb0b3a7124c6e741" gracePeriod=30
Sep 29 10:02:43 crc kubenswrapper[4922]: I0929 10:02:43.508732 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="proxy-httpd" containerID="cri-o://8429ce5a84aa82bf600d193267f99157d69f4a6f313b870783bd40192ae4eb21" gracePeriod=30
Sep 29 10:02:43 crc kubenswrapper[4922]: I0929 10:02:43.508817 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="ceilometer-notification-agent" containerID="cri-o://6a7ec4aabb608d8cbb7fa5f0312589bbd3a985d244e1088fd55169af4f2fd950" gracePeriod=30
Sep 29 10:02:43 crc kubenswrapper[4922]: I0929 10:02:43.508885 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="sg-core" containerID="cri-o://3ade7ce3dbc51dd2e09b5b238b6d312edf54f4f76439b3e0bbec3cac3dc1f7d3" gracePeriod=30
Sep 29 10:02:43 crc kubenswrapper[4922]: I0929 10:02:43.524650 4922 generic.go:334] "Generic (PLEG): container finished" podID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerID="c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694" exitCode=143
Sep 29 10:02:43 crc kubenswrapper[4922]: I0929 10:02:43.524701 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23d3429-fb5e-4829-9c7b-65f6104fe30c","Type":"ContainerDied","Data":"c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694"}
Sep 29 10:02:43 crc kubenswrapper[4922]: I0929 10:02:43.538327 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.263363251 podStartE2EDuration="6.538301118s" podCreationTimestamp="2025-09-29 10:02:37 +0000 UTC" firstStartedPulling="2025-09-29 10:02:38.851270117 +0000 UTC m=+1084.217500381" lastFinishedPulling="2025-09-29 10:02:43.126207984 +0000 UTC m=+1088.492438248" observedRunningTime="2025-09-29 10:02:43.53690074 +0000 UTC m=+1088.903131024" watchObservedRunningTime="2025-09-29 10:02:43.538301118 +0000 UTC m=+1088.904531382"
Sep 29 10:02:44 crc kubenswrapper[4922]: I0929 10:02:44.541176 4922 generic.go:334] "Generic (PLEG): container finished" podID="8a92c760-c810-46be-aeea-f295975b8451" containerID="8429ce5a84aa82bf600d193267f99157d69f4a6f313b870783bd40192ae4eb21" exitCode=0
Sep 29 10:02:44 crc kubenswrapper[4922]: I0929 10:02:44.542285 4922 generic.go:334] "Generic (PLEG): container finished" podID="8a92c760-c810-46be-aeea-f295975b8451" containerID="3ade7ce3dbc51dd2e09b5b238b6d312edf54f4f76439b3e0bbec3cac3dc1f7d3" exitCode=2
Sep 29 10:02:44 crc kubenswrapper[4922]: I0929 10:02:44.542389 4922 generic.go:334] "Generic (PLEG): container finished" podID="8a92c760-c810-46be-aeea-f295975b8451" containerID="6a7ec4aabb608d8cbb7fa5f0312589bbd3a985d244e1088fd55169af4f2fd950" exitCode=0
Sep 29 10:02:44 crc kubenswrapper[4922]: I0929 10:02:44.541270 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerDied","Data":"8429ce5a84aa82bf600d193267f99157d69f4a6f313b870783bd40192ae4eb21"}
Sep 29 10:02:44 crc kubenswrapper[4922]: I0929 10:02:44.542597 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerDied","Data":"3ade7ce3dbc51dd2e09b5b238b6d312edf54f4f76439b3e0bbec3cac3dc1f7d3"}
Sep 29 10:02:44 crc kubenswrapper[4922]: I0929 10:02:44.542691 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerDied","Data":"6a7ec4aabb608d8cbb7fa5f0312589bbd3a985d244e1088fd55169af4f2fd950"}
Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.961447 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3536ac4e_7447_4439_aa09_ef3fce28f84a.slice/crio-conmon-5bfae74081334f822c7239f7a7142602cfb9475747994858e5fafbd23645e2c4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3536ac4e_7447_4439_aa09_ef3fce28f84a.slice/crio-conmon-5bfae74081334f822c7239f7a7142602cfb9475747994858e5fafbd23645e2c4.scope: no such file or directory
Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.962290 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3536ac4e_7447_4439_aa09_ef3fce28f84a.slice/crio-5bfae74081334f822c7239f7a7142602cfb9475747994858e5fafbd23645e2c4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3536ac4e_7447_4439_aa09_ef3fce28f84a.slice/crio-5bfae74081334f822c7239f7a7142602cfb9475747994858e5fafbd23645e2c4.scope: no such file or directory
Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.962330 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179180d9_2c92_4e80_bf4c_560bfe6e3a69.slice/crio-d7282c01419a74f27d0b9accaf14900951a579f6d8a96ffec4afce1fb373401d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179180d9_2c92_4e80_bf4c_560bfe6e3a69.slice/crio-d7282c01419a74f27d0b9accaf14900951a579f6d8a96ffec4afce1fb373401d.scope: no such file or directory
Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.962354 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbac65ee5_a195_4375_a997_c0f5cfea448e.slice/crio-bcbcfbce22e0c6bfeaffa09e14ca3c0f50bfad218d475aa32f09b92ff698fa81": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbac65ee5_a195_4375_a997_c0f5cfea448e.slice/crio-bcbcfbce22e0c6bfeaffa09e14ca3c0f50bfad218d475aa32f09b92ff698fa81: no such file or directory
Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.965674 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbac65ee5_a195_4375_a997_c0f5cfea448e.slice/crio-conmon-2ad9df1339618630e3f336a05c1b9feb2487875c208c5ac27898a98a97112b63.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbac65ee5_a195_4375_a997_c0f5cfea448e.slice/crio-conmon-2ad9df1339618630e3f336a05c1b9feb2487875c208c5ac27898a98a97112b63.scope: no such file or directory
Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.965937 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbac65ee5_a195_4375_a997_c0f5cfea448e.slice/crio-2ad9df1339618630e3f336a05c1b9feb2487875c208c5ac27898a98a97112b63.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbac65ee5_a195_4375_a997_c0f5cfea448e.slice/crio-2ad9df1339618630e3f336a05c1b9feb2487875c208c5ac27898a98a97112b63.scope: no such file or directory
Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.968147 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-conmon-6a7ec4aabb608d8cbb7fa5f0312589bbd3a985d244e1088fd55169af4f2fd950.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-conmon-6a7ec4aabb608d8cbb7fa5f0312589bbd3a985d244e1088fd55169af4f2fd950.scope: no such file or directory
Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.968209 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-6a7ec4aabb608d8cbb7fa5f0312589bbd3a985d244e1088fd55169af4f2fd950.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-6a7ec4aabb608d8cbb7fa5f0312589bbd3a985d244e1088fd55169af4f2fd950.scope: no such file or directory
Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.968233 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-conmon-3ade7ce3dbc51dd2e09b5b238b6d312edf54f4f76439b3e0bbec3cac3dc1f7d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-conmon-3ade7ce3dbc51dd2e09b5b238b6d312edf54f4f76439b3e0bbec3cac3dc1f7d3.scope: no such file or directory Sep 29 10:02:46 crc kubenswrapper[4922]: W0929 10:02:46.968259 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-3ade7ce3dbc51dd2e09b5b238b6d312edf54f4f76439b3e0bbec3cac3dc1f7d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-3ade7ce3dbc51dd2e09b5b238b6d312edf54f4f76439b3e0bbec3cac3dc1f7d3.scope: no such file or directory Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.387223 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.419266 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-tls-certs\") pod \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.419385 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-combined-ca-bundle\") pod \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.419492 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-config-data\") pod 
\"c63e97c2-45d9-4b32-9b0e-1449fad249e6\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.419583 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-scripts\") pod \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.419629 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7jx6\" (UniqueName: \"kubernetes.io/projected/c63e97c2-45d9-4b32-9b0e-1449fad249e6-kube-api-access-w7jx6\") pod \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.419663 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63e97c2-45d9-4b32-9b0e-1449fad249e6-logs\") pod \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.419701 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-secret-key\") pod \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\" (UID: \"c63e97c2-45d9-4b32-9b0e-1449fad249e6\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.422753 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63e97c2-45d9-4b32-9b0e-1449fad249e6-logs" (OuterVolumeSpecName: "logs") pod "c63e97c2-45d9-4b32-9b0e-1449fad249e6" (UID: "c63e97c2-45d9-4b32-9b0e-1449fad249e6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.431228 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c63e97c2-45d9-4b32-9b0e-1449fad249e6-kube-api-access-w7jx6" (OuterVolumeSpecName: "kube-api-access-w7jx6") pod "c63e97c2-45d9-4b32-9b0e-1449fad249e6" (UID: "c63e97c2-45d9-4b32-9b0e-1449fad249e6"). InnerVolumeSpecName "kube-api-access-w7jx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.479471 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c63e97c2-45d9-4b32-9b0e-1449fad249e6" (UID: "c63e97c2-45d9-4b32-9b0e-1449fad249e6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.498032 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c63e97c2-45d9-4b32-9b0e-1449fad249e6" (UID: "c63e97c2-45d9-4b32-9b0e-1449fad249e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.508257 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-config-data" (OuterVolumeSpecName: "config-data") pod "c63e97c2-45d9-4b32-9b0e-1449fad249e6" (UID: "c63e97c2-45d9-4b32-9b0e-1449fad249e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.521408 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7jx6\" (UniqueName: \"kubernetes.io/projected/c63e97c2-45d9-4b32-9b0e-1449fad249e6-kube-api-access-w7jx6\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.521458 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c63e97c2-45d9-4b32-9b0e-1449fad249e6-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.521472 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.521483 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.521493 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.539225 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "c63e97c2-45d9-4b32-9b0e-1449fad249e6" (UID: "c63e97c2-45d9-4b32-9b0e-1449fad249e6"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.544526 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-scripts" (OuterVolumeSpecName: "scripts") pod "c63e97c2-45d9-4b32-9b0e-1449fad249e6" (UID: "c63e97c2-45d9-4b32-9b0e-1449fad249e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.568692 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.578774 4922 generic.go:334] "Generic (PLEG): container finished" podID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerID="92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0" exitCode=0 Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.578846 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.578875 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23d3429-fb5e-4829-9c7b-65f6104fe30c","Type":"ContainerDied","Data":"92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0"} Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.579398 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c23d3429-fb5e-4829-9c7b-65f6104fe30c","Type":"ContainerDied","Data":"fba3fec3edcff73a297bbd9b9322e0cd187f6d7a1c7e19c2e62580935c4f9eaf"} Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.579426 4922 scope.go:117] "RemoveContainer" containerID="92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.583168 4922 generic.go:334] "Generic (PLEG): container finished" podID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerID="0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31" exitCode=137 Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.583212 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fc765957b-xd4sr" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.583233 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc765957b-xd4sr" event={"ID":"c63e97c2-45d9-4b32-9b0e-1449fad249e6","Type":"ContainerDied","Data":"0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31"} Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.583278 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc765957b-xd4sr" event={"ID":"c63e97c2-45d9-4b32-9b0e-1449fad249e6","Type":"ContainerDied","Data":"de7a34dec5357e85afd4f33949f846472ed92f73a1af820e606903b1a8d0f394"} Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.622116 4922 scope.go:117] "RemoveContainer" containerID="c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.625736 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c63e97c2-45d9-4b32-9b0e-1449fad249e6-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.625775 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63e97c2-45d9-4b32-9b0e-1449fad249e6-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.653644 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fc765957b-xd4sr"] Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.659248 4922 scope.go:117] "RemoveContainer" containerID="92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0" Sep 29 10:02:47 crc kubenswrapper[4922]: E0929 10:02:47.659948 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0\": container with ID 
starting with 92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0 not found: ID does not exist" containerID="92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.660007 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0"} err="failed to get container status \"92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0\": rpc error: code = NotFound desc = could not find container \"92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0\": container with ID starting with 92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0 not found: ID does not exist" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.660045 4922 scope.go:117] "RemoveContainer" containerID="c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694" Sep 29 10:02:47 crc kubenswrapper[4922]: E0929 10:02:47.660670 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694\": container with ID starting with c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694 not found: ID does not exist" containerID="c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.660712 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694"} err="failed to get container status \"c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694\": rpc error: code = NotFound desc = could not find container \"c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694\": container with ID starting with c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694 not found: 
ID does not exist" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.660744 4922 scope.go:117] "RemoveContainer" containerID="279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.671077 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-fc765957b-xd4sr"] Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.727591 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-scripts\") pod \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.727762 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.727864 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-config-data\") pod \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.727970 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-httpd-run\") pod \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.728025 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-logs\") pod \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\" (UID: 
\"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.728106 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-internal-tls-certs\") pod \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.728161 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tjtq\" (UniqueName: \"kubernetes.io/projected/c23d3429-fb5e-4829-9c7b-65f6104fe30c-kube-api-access-6tjtq\") pod \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.728246 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-combined-ca-bundle\") pod \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\" (UID: \"c23d3429-fb5e-4829-9c7b-65f6104fe30c\") " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.728587 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c23d3429-fb5e-4829-9c7b-65f6104fe30c" (UID: "c23d3429-fb5e-4829-9c7b-65f6104fe30c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.728885 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.729068 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-logs" (OuterVolumeSpecName: "logs") pod "c23d3429-fb5e-4829-9c7b-65f6104fe30c" (UID: "c23d3429-fb5e-4829-9c7b-65f6104fe30c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.734672 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "c23d3429-fb5e-4829-9c7b-65f6104fe30c" (UID: "c23d3429-fb5e-4829-9c7b-65f6104fe30c"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.736341 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-scripts" (OuterVolumeSpecName: "scripts") pod "c23d3429-fb5e-4829-9c7b-65f6104fe30c" (UID: "c23d3429-fb5e-4829-9c7b-65f6104fe30c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.743699 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23d3429-fb5e-4829-9c7b-65f6104fe30c-kube-api-access-6tjtq" (OuterVolumeSpecName: "kube-api-access-6tjtq") pod "c23d3429-fb5e-4829-9c7b-65f6104fe30c" (UID: "c23d3429-fb5e-4829-9c7b-65f6104fe30c"). InnerVolumeSpecName "kube-api-access-6tjtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.795799 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c23d3429-fb5e-4829-9c7b-65f6104fe30c" (UID: "c23d3429-fb5e-4829-9c7b-65f6104fe30c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.830721 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-config-data" (OuterVolumeSpecName: "config-data") pod "c23d3429-fb5e-4829-9c7b-65f6104fe30c" (UID: "c23d3429-fb5e-4829-9c7b-65f6104fe30c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.833979 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c23d3429-fb5e-4829-9c7b-65f6104fe30c" (UID: "c23d3429-fb5e-4829-9c7b-65f6104fe30c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.834899 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c23d3429-fb5e-4829-9c7b-65f6104fe30c-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.834953 4922 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.834971 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tjtq\" (UniqueName: \"kubernetes.io/projected/c23d3429-fb5e-4829-9c7b-65f6104fe30c-kube-api-access-6tjtq\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.834985 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.835000 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.835054 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.835072 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23d3429-fb5e-4829-9c7b-65f6104fe30c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.845405 4922 scope.go:117] "RemoveContainer" 
containerID="0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.909690 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.922418 4922 scope.go:117] "RemoveContainer" containerID="279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66" Sep 29 10:02:47 crc kubenswrapper[4922]: E0929 10:02:47.924349 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66\": container with ID starting with 279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66 not found: ID does not exist" containerID="279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.924406 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66"} err="failed to get container status \"279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66\": rpc error: code = NotFound desc = could not find container \"279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66\": container with ID starting with 279290ec3b7510adb8f68066c9a75b7ee5fb1d32e53424d5c435145cf88cdc66 not found: ID does not exist" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.931673 4922 scope.go:117] "RemoveContainer" containerID="0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31" Sep 29 10:02:47 crc kubenswrapper[4922]: E0929 10:02:47.938010 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31\": container with 
ID starting with 0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31 not found: ID does not exist" containerID="0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.938076 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31"} err="failed to get container status \"0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31\": rpc error: code = NotFound desc = could not find container \"0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31\": container with ID starting with 0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31 not found: ID does not exist" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.941438 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.943767 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:02:47 crc kubenswrapper[4922]: I0929 10:02:47.968703 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.034351 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:02:48 crc kubenswrapper[4922]: E0929 10:02:48.035267 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerName="glance-log" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035288 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerName="glance-log" Sep 29 10:02:48 crc kubenswrapper[4922]: E0929 10:02:48.035306 4922 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3536ac4e-7447-4439-aa09-ef3fce28f84a" containerName="mariadb-database-create" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035312 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3536ac4e-7447-4439-aa09-ef3fce28f84a" containerName="mariadb-database-create" Sep 29 10:02:48 crc kubenswrapper[4922]: E0929 10:02:48.035334 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon-log" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035341 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon-log" Sep 29 10:02:48 crc kubenswrapper[4922]: E0929 10:02:48.035354 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerName="glance-httpd" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035360 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerName="glance-httpd" Sep 29 10:02:48 crc kubenswrapper[4922]: E0929 10:02:48.035370 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179180d9-2c92-4e80-bf4c-560bfe6e3a69" containerName="mariadb-database-create" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035376 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="179180d9-2c92-4e80-bf4c-560bfe6e3a69" containerName="mariadb-database-create" Sep 29 10:02:48 crc kubenswrapper[4922]: E0929 10:02:48.035384 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac65ee5-a195-4375-a997-c0f5cfea448e" containerName="mariadb-database-create" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035390 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac65ee5-a195-4375-a997-c0f5cfea448e" containerName="mariadb-database-create" Sep 29 10:02:48 crc kubenswrapper[4922]: E0929 10:02:48.035420 4922 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035427 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035637 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3536ac4e-7447-4439-aa09-ef3fce28f84a" containerName="mariadb-database-create" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035651 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="179180d9-2c92-4e80-bf4c-560bfe6e3a69" containerName="mariadb-database-create" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035665 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerName="glance-log" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035676 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="bac65ee5-a195-4375-a997-c0f5cfea448e" containerName="mariadb-database-create" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035691 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" containerName="glance-httpd" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035699 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon-log" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.035976 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" containerName="horizon" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.039257 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.043623 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.044260 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.059572 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:02:48 crc kubenswrapper[4922]: E0929 10:02:48.116449 4922 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/901e3bbd3a3d0fbb20f16856e34519243706f6e4ab5b32d4b6c3a1106fe4d1ee/diff" to get inode usage: stat /var/lib/containers/storage/overlay/901e3bbd3a3d0fbb20f16856e34519243706f6e4ab5b32d4b6c3a1106fe4d1ee/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-internal-api-0_c23d3429-fb5e-4829-9c7b-65f6104fe30c/glance-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-internal-api-0_c23d3429-fb5e-4829-9c7b-65f6104fe30c/glance-httpd/0.log: no such file or directory Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.145742 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.145860 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.145884 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec5073f-9a07-4292-8cfb-62e419a0438d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.145911 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.145938 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec5073f-9a07-4292-8cfb-62e419a0438d-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.145994 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.146031 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.146066 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ctxt\" (UniqueName: \"kubernetes.io/projected/5ec5073f-9a07-4292-8cfb-62e419a0438d-kube-api-access-4ctxt\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.189337 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2ee9-account-create-g6ttg"] Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.190752 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2ee9-account-create-g6ttg" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.194348 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.223297 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2ee9-account-create-g6ttg"] Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.249903 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ctxt\" (UniqueName: \"kubernetes.io/projected/5ec5073f-9a07-4292-8cfb-62e419a0438d-kube-api-access-4ctxt\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.250076 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " 
pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.250293 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.250337 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec5073f-9a07-4292-8cfb-62e419a0438d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.250368 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.250417 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec5073f-9a07-4292-8cfb-62e419a0438d-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.250576 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc 
kubenswrapper[4922]: I0929 10:02:48.250674 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.251276 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.260084 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec5073f-9a07-4292-8cfb-62e419a0438d-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.260610 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec5073f-9a07-4292-8cfb-62e419a0438d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.263980 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.272645 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.273615 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.288225 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec5073f-9a07-4292-8cfb-62e419a0438d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.296678 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ctxt\" (UniqueName: \"kubernetes.io/projected/5ec5073f-9a07-4292-8cfb-62e419a0438d-kube-api-access-4ctxt\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.314241 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ec5073f-9a07-4292-8cfb-62e419a0438d\") " pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.352514 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7g7j\" (UniqueName: 
\"kubernetes.io/projected/146980d0-cc5b-44ad-87c7-fd6463f25659-kube-api-access-p7g7j\") pod \"nova-api-2ee9-account-create-g6ttg\" (UID: \"146980d0-cc5b-44ad-87c7-fd6463f25659\") " pod="openstack/nova-api-2ee9-account-create-g6ttg" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.389630 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.420800 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b33d-account-create-7wvb5"] Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.422548 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b33d-account-create-7wvb5" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.431448 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.455516 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7g7j\" (UniqueName: \"kubernetes.io/projected/146980d0-cc5b-44ad-87c7-fd6463f25659-kube-api-access-p7g7j\") pod \"nova-api-2ee9-account-create-g6ttg\" (UID: \"146980d0-cc5b-44ad-87c7-fd6463f25659\") " pod="openstack/nova-api-2ee9-account-create-g6ttg" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.465142 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b33d-account-create-7wvb5"] Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.512069 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7g7j\" (UniqueName: \"kubernetes.io/projected/146980d0-cc5b-44ad-87c7-fd6463f25659-kube-api-access-p7g7j\") pod \"nova-api-2ee9-account-create-g6ttg\" (UID: \"146980d0-cc5b-44ad-87c7-fd6463f25659\") " pod="openstack/nova-api-2ee9-account-create-g6ttg" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.527733 
4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2ee9-account-create-g6ttg" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.566347 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxrnl\" (UniqueName: \"kubernetes.io/projected/d8e9ee79-2a19-4e08-9e57-a1d745c7976e-kube-api-access-cxrnl\") pod \"nova-cell0-b33d-account-create-7wvb5\" (UID: \"d8e9ee79-2a19-4e08-9e57-a1d745c7976e\") " pod="openstack/nova-cell0-b33d-account-create-7wvb5" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.664158 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0045-account-create-p2pzv"] Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.666012 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0045-account-create-p2pzv" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.672121 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxrnl\" (UniqueName: \"kubernetes.io/projected/d8e9ee79-2a19-4e08-9e57-a1d745c7976e-kube-api-access-cxrnl\") pod \"nova-cell0-b33d-account-create-7wvb5\" (UID: \"d8e9ee79-2a19-4e08-9e57-a1d745c7976e\") " pod="openstack/nova-cell0-b33d-account-create-7wvb5" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.698999 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.724869 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxrnl\" (UniqueName: \"kubernetes.io/projected/d8e9ee79-2a19-4e08-9e57-a1d745c7976e-kube-api-access-cxrnl\") pod \"nova-cell0-b33d-account-create-7wvb5\" (UID: \"d8e9ee79-2a19-4e08-9e57-a1d745c7976e\") " pod="openstack/nova-cell0-b33d-account-create-7wvb5" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.725654 4922 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0045-account-create-p2pzv"] Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.776675 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5gw\" (UniqueName: \"kubernetes.io/projected/77c731a0-29c8-476c-98a0-3bf96579183f-kube-api-access-lc5gw\") pod \"nova-cell1-0045-account-create-p2pzv\" (UID: \"77c731a0-29c8-476c-98a0-3bf96579183f\") " pod="openstack/nova-cell1-0045-account-create-p2pzv" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.870693 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b33d-account-create-7wvb5" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.879425 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5gw\" (UniqueName: \"kubernetes.io/projected/77c731a0-29c8-476c-98a0-3bf96579183f-kube-api-access-lc5gw\") pod \"nova-cell1-0045-account-create-p2pzv\" (UID: \"77c731a0-29c8-476c-98a0-3bf96579183f\") " pod="openstack/nova-cell1-0045-account-create-p2pzv" Sep 29 10:02:48 crc kubenswrapper[4922]: I0929 10:02:48.904338 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5gw\" (UniqueName: \"kubernetes.io/projected/77c731a0-29c8-476c-98a0-3bf96579183f-kube-api-access-lc5gw\") pod \"nova-cell1-0045-account-create-p2pzv\" (UID: \"77c731a0-29c8-476c-98a0-3bf96579183f\") " pod="openstack/nova-cell1-0045-account-create-p2pzv" Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.030125 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0045-account-create-p2pzv" Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.267617 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.380172 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b33d-account-create-7wvb5"] Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.398735 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2ee9-account-create-g6ttg"] Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.465299 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c23d3429-fb5e-4829-9c7b-65f6104fe30c" path="/var/lib/kubelet/pods/c23d3429-fb5e-4829-9c7b-65f6104fe30c/volumes" Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.466451 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c63e97c2-45d9-4b32-9b0e-1449fad249e6" path="/var/lib/kubelet/pods/c63e97c2-45d9-4b32-9b0e-1449fad249e6/volumes" Sep 29 10:02:49 crc kubenswrapper[4922]: W0929 10:02:49.521385 4922 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-conmon-8429ce5a84aa82bf600d193267f99157d69f4a6f313b870783bd40192ae4eb21.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-conmon-8429ce5a84aa82bf600d193267f99157d69f4a6f313b870783bd40192ae4eb21.scope: no such file or directory Sep 29 10:02:49 crc kubenswrapper[4922]: W0929 10:02:49.521455 4922 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-8429ce5a84aa82bf600d193267f99157d69f4a6f313b870783bd40192ae4eb21.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a92c760_c810_46be_aeea_f295975b8451.slice/crio-8429ce5a84aa82bf600d193267f99157d69f4a6f313b870783bd40192ae4eb21.scope: no such file or directory Sep 29 10:02:49 crc kubenswrapper[4922]: W0929 10:02:49.722232 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77c731a0_29c8_476c_98a0_3bf96579183f.slice/crio-7d7392d62506e54bb5bbebf41db2006cfe74e97058b204981503c0a19a297c1b WatchSource:0}: Error finding container 7d7392d62506e54bb5bbebf41db2006cfe74e97058b204981503c0a19a297c1b: Status 404 returned error can't find the container with id 7d7392d62506e54bb5bbebf41db2006cfe74e97058b204981503c0a19a297c1b Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.748635 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0045-account-create-p2pzv"] Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.768633 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2ee9-account-create-g6ttg" event={"ID":"146980d0-cc5b-44ad-87c7-fd6463f25659","Type":"ContainerStarted","Data":"a7e41743c6d5f022fcce9e2cf33b5b754f3184b6b9dec749954be51865198c9e"} Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.782379 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b33d-account-create-7wvb5" event={"ID":"d8e9ee79-2a19-4e08-9e57-a1d745c7976e","Type":"ContainerStarted","Data":"b733d789a41b14fa487c4557612684865cdc99ab3bab872b95a4f59216bfc2dd"} Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.788404 4922 generic.go:334] "Generic (PLEG): container finished" 
podID="8a92c760-c810-46be-aeea-f295975b8451" containerID="82b2e0f247a05bbf54288d98d2b691ef17b255235458acf0eb0b3a7124c6e741" exitCode=0 Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.788510 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerDied","Data":"82b2e0f247a05bbf54288d98d2b691ef17b255235458acf0eb0b3a7124c6e741"} Sep 29 10:02:49 crc kubenswrapper[4922]: I0929 10:02:49.790589 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ec5073f-9a07-4292-8cfb-62e419a0438d","Type":"ContainerStarted","Data":"5bae56f8d01a7eef41b961efacf5762aaed834f4cfcb4600638283a915ef712e"} Sep 29 10:02:49 crc kubenswrapper[4922]: E0929 10:02:49.929335 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179180d9_2c92_4e80_bf4c_560bfe6e3a69.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbac65ee5_a195_4375_a997_c0f5cfea448e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc63e97c2_45d9_4b32_9b0e_1449fad249e6.slice/crio-conmon-0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc63e97c2_45d9_4b32_9b0e_1449fad249e6.slice/crio-0a82a7a376667f2fcc0422e4c0d01d5806a33d2f3c39d2fe9a0e39a74ca37d31.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc63e97c2_45d9_4b32_9b0e_1449fad249e6.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc63e97c2_45d9_4b32_9b0e_1449fad249e6.slice/crio-de7a34dec5357e85afd4f33949f846472ed92f73a1af820e606903b1a8d0f394\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23d3429_fb5e_4829_9c7b_65f6104fe30c.slice/crio-fba3fec3edcff73a297bbd9b9322e0cd187f6d7a1c7e19c2e62580935c4f9eaf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179180d9_2c92_4e80_bf4c_560bfe6e3a69.slice/crio-ffd8d520a5b98f7b72a061912ff7ffff9a7674078fd1e5e4f095a5e5fc15977d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23d3429_fb5e_4829_9c7b_65f6104fe30c.slice/crio-92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23d3429_fb5e_4829_9c7b_65f6104fe30c.slice/crio-conmon-c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23d3429_fb5e_4829_9c7b_65f6104fe30c.slice/crio-c40eba15627fe7910ba7c96b9bf70a21a51a72d6490b17792f0850a82bd73694.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3536ac4e_7447_4439_aa09_ef3fce28f84a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23d3429_fb5e_4829_9c7b_65f6104fe30c.slice/crio-conmon-92f9b67ee19e58ebc85098af1618543496d81869125a65bc5797de26f4a4ede0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23d3429_fb5e_4829_9c7b_65f6104fe30c.slice\": RecentStats: unable to find data in 
memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3536ac4e_7447_4439_aa09_ef3fce28f84a.slice/crio-eba09d2e9656f2264228c0c4ac06f5db57a494da0d667956471379f19e281f27\": RecentStats: unable to find data in memory cache]" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.270800 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.439271 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-config-data\") pod \"8a92c760-c810-46be-aeea-f295975b8451\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.439526 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-combined-ca-bundle\") pod \"8a92c760-c810-46be-aeea-f295975b8451\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.439669 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-sg-core-conf-yaml\") pod \"8a92c760-c810-46be-aeea-f295975b8451\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.439698 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4grj\" (UniqueName: \"kubernetes.io/projected/8a92c760-c810-46be-aeea-f295975b8451-kube-api-access-x4grj\") pod \"8a92c760-c810-46be-aeea-f295975b8451\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.439735 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-log-httpd\") pod \"8a92c760-c810-46be-aeea-f295975b8451\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.439803 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-run-httpd\") pod \"8a92c760-c810-46be-aeea-f295975b8451\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.439853 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-scripts\") pod \"8a92c760-c810-46be-aeea-f295975b8451\" (UID: \"8a92c760-c810-46be-aeea-f295975b8451\") " Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.441956 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a92c760-c810-46be-aeea-f295975b8451" (UID: "8a92c760-c810-46be-aeea-f295975b8451"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.442298 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a92c760-c810-46be-aeea-f295975b8451" (UID: "8a92c760-c810-46be-aeea-f295975b8451"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.467346 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-scripts" (OuterVolumeSpecName: "scripts") pod "8a92c760-c810-46be-aeea-f295975b8451" (UID: "8a92c760-c810-46be-aeea-f295975b8451"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.468255 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a92c760-c810-46be-aeea-f295975b8451-kube-api-access-x4grj" (OuterVolumeSpecName: "kube-api-access-x4grj") pod "8a92c760-c810-46be-aeea-f295975b8451" (UID: "8a92c760-c810-46be-aeea-f295975b8451"). InnerVolumeSpecName "kube-api-access-x4grj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.518589 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a92c760-c810-46be-aeea-f295975b8451" (UID: "8a92c760-c810-46be-aeea-f295975b8451"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.544224 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.544317 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4grj\" (UniqueName: \"kubernetes.io/projected/8a92c760-c810-46be-aeea-f295975b8451-kube-api-access-x4grj\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.544356 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.544374 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a92c760-c810-46be-aeea-f295975b8451-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.544388 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.557028 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a92c760-c810-46be-aeea-f295975b8451" (UID: "8a92c760-c810-46be-aeea-f295975b8451"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.634858 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-config-data" (OuterVolumeSpecName: "config-data") pod "8a92c760-c810-46be-aeea-f295975b8451" (UID: "8a92c760-c810-46be-aeea-f295975b8451"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.651778 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.653597 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a92c760-c810-46be-aeea-f295975b8451-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.815932 4922 generic.go:334] "Generic (PLEG): container finished" podID="146980d0-cc5b-44ad-87c7-fd6463f25659" containerID="8832034619e8459a41e5045254a8e0b2567e755ef13b91f024e859cb9fd0c84e" exitCode=0 Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.816032 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2ee9-account-create-g6ttg" event={"ID":"146980d0-cc5b-44ad-87c7-fd6463f25659","Type":"ContainerDied","Data":"8832034619e8459a41e5045254a8e0b2567e755ef13b91f024e859cb9fd0c84e"} Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.825593 4922 generic.go:334] "Generic (PLEG): container finished" podID="d8e9ee79-2a19-4e08-9e57-a1d745c7976e" containerID="bce7dc1844ea1b95e0b5e90c68f9469df1ab8d2094dbe5818fad88cae3d68079" exitCode=0 Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.826120 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-b33d-account-create-7wvb5" event={"ID":"d8e9ee79-2a19-4e08-9e57-a1d745c7976e","Type":"ContainerDied","Data":"bce7dc1844ea1b95e0b5e90c68f9469df1ab8d2094dbe5818fad88cae3d68079"} Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.848949 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a92c760-c810-46be-aeea-f295975b8451","Type":"ContainerDied","Data":"fc0c3d455c295f7368b789e3b0f2ec0159f239ac7fa90c8cc8a697a2e00cfc16"} Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.849337 4922 scope.go:117] "RemoveContainer" containerID="8429ce5a84aa82bf600d193267f99157d69f4a6f313b870783bd40192ae4eb21" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.849617 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.867687 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ec5073f-9a07-4292-8cfb-62e419a0438d","Type":"ContainerStarted","Data":"a91a05409d3c40b754310fa19825f289cbca202c2083eb843af878c8463b144f"} Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.889133 4922 generic.go:334] "Generic (PLEG): container finished" podID="77c731a0-29c8-476c-98a0-3bf96579183f" containerID="0f7f359fbce576e1fdb0dec2904402546251b18c10b6d1b01d139a3a7832b46f" exitCode=0 Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.889483 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0045-account-create-p2pzv" event={"ID":"77c731a0-29c8-476c-98a0-3bf96579183f","Type":"ContainerDied","Data":"0f7f359fbce576e1fdb0dec2904402546251b18c10b6d1b01d139a3a7832b46f"} Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.889560 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0045-account-create-p2pzv" 
event={"ID":"77c731a0-29c8-476c-98a0-3bf96579183f","Type":"ContainerStarted","Data":"7d7392d62506e54bb5bbebf41db2006cfe74e97058b204981503c0a19a297c1b"} Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.896297 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.915742 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.934882 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:50 crc kubenswrapper[4922]: E0929 10:02:50.935416 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="proxy-httpd" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.935433 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="proxy-httpd" Sep 29 10:02:50 crc kubenswrapper[4922]: E0929 10:02:50.935443 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="sg-core" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.935452 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="sg-core" Sep 29 10:02:50 crc kubenswrapper[4922]: E0929 10:02:50.935485 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="ceilometer-central-agent" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.935492 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="ceilometer-central-agent" Sep 29 10:02:50 crc kubenswrapper[4922]: E0929 10:02:50.935514 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="ceilometer-notification-agent" 
Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.935521 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="ceilometer-notification-agent" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.935740 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="sg-core" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.935755 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="ceilometer-central-agent" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.935764 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="proxy-httpd" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.935775 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a92c760-c810-46be-aeea-f295975b8451" containerName="ceilometer-notification-agent" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.937641 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.940545 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.948478 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.963246 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-run-httpd\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.963609 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-log-httpd\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.963817 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-config-data\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.963943 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.964045 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhtg4\" (UniqueName: \"kubernetes.io/projected/afa3df04-1c02-486b-b02f-12a9c40aedb8-kube-api-access-hhtg4\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.964327 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:50 crc kubenswrapper[4922]: I0929 10:02:50.964614 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-scripts\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.002162 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.024862 4922 scope.go:117] "RemoveContainer" containerID="3ade7ce3dbc51dd2e09b5b238b6d312edf54f4f76439b3e0bbec3cac3dc1f7d3" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.067038 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-run-httpd\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.067100 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-log-httpd\") pod \"ceilometer-0\" (UID: 
\"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.067143 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-config-data\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.067168 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.067193 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhtg4\" (UniqueName: \"kubernetes.io/projected/afa3df04-1c02-486b-b02f-12a9c40aedb8-kube-api-access-hhtg4\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.067259 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.067304 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-scripts\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.070451 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-run-httpd\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.070748 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-log-httpd\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.074150 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-config-data\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.075902 4922 scope.go:117] "RemoveContainer" containerID="6a7ec4aabb608d8cbb7fa5f0312589bbd3a985d244e1088fd55169af4f2fd950" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.077914 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.079460 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-scripts\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.084224 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-sg-core-conf-yaml\") 
pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.094242 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhtg4\" (UniqueName: \"kubernetes.io/projected/afa3df04-1c02-486b-b02f-12a9c40aedb8-kube-api-access-hhtg4\") pod \"ceilometer-0\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.114143 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.251642 4922 scope.go:117] "RemoveContainer" containerID="82b2e0f247a05bbf54288d98d2b691ef17b255235458acf0eb0b3a7124c6e741" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.267948 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.476546 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a92c760-c810-46be-aeea-f295975b8451" path="/var/lib/kubelet/pods/8a92c760-c810-46be-aeea-f295975b8451/volumes" Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.863102 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.906550 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ec5073f-9a07-4292-8cfb-62e419a0438d","Type":"ContainerStarted","Data":"9f75253390d2c15915496202c0623c4fde4c32c37d9eab06b1ae7ecc13ece984"} Sep 29 10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.908692 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerStarted","Data":"e8c81f7990c2c7553ef72504e06632f463c07626fcb9367ad3537f6b1006650f"} Sep 29 
10:02:51 crc kubenswrapper[4922]: I0929 10:02:51.944477 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.944440671 podStartE2EDuration="4.944440671s" podCreationTimestamp="2025-09-29 10:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:02:51.931915452 +0000 UTC m=+1097.298145746" watchObservedRunningTime="2025-09-29 10:02:51.944440671 +0000 UTC m=+1097.310670935" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.585138 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b33d-account-create-7wvb5" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.596343 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2ee9-account-create-g6ttg" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.602434 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0045-account-create-p2pzv" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.709711 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7g7j\" (UniqueName: \"kubernetes.io/projected/146980d0-cc5b-44ad-87c7-fd6463f25659-kube-api-access-p7g7j\") pod \"146980d0-cc5b-44ad-87c7-fd6463f25659\" (UID: \"146980d0-cc5b-44ad-87c7-fd6463f25659\") " Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.710095 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxrnl\" (UniqueName: \"kubernetes.io/projected/d8e9ee79-2a19-4e08-9e57-a1d745c7976e-kube-api-access-cxrnl\") pod \"d8e9ee79-2a19-4e08-9e57-a1d745c7976e\" (UID: \"d8e9ee79-2a19-4e08-9e57-a1d745c7976e\") " Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.710220 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5gw\" (UniqueName: \"kubernetes.io/projected/77c731a0-29c8-476c-98a0-3bf96579183f-kube-api-access-lc5gw\") pod \"77c731a0-29c8-476c-98a0-3bf96579183f\" (UID: \"77c731a0-29c8-476c-98a0-3bf96579183f\") " Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.718078 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146980d0-cc5b-44ad-87c7-fd6463f25659-kube-api-access-p7g7j" (OuterVolumeSpecName: "kube-api-access-p7g7j") pod "146980d0-cc5b-44ad-87c7-fd6463f25659" (UID: "146980d0-cc5b-44ad-87c7-fd6463f25659"). InnerVolumeSpecName "kube-api-access-p7g7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.720439 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c731a0-29c8-476c-98a0-3bf96579183f-kube-api-access-lc5gw" (OuterVolumeSpecName: "kube-api-access-lc5gw") pod "77c731a0-29c8-476c-98a0-3bf96579183f" (UID: "77c731a0-29c8-476c-98a0-3bf96579183f"). InnerVolumeSpecName "kube-api-access-lc5gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.721002 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e9ee79-2a19-4e08-9e57-a1d745c7976e-kube-api-access-cxrnl" (OuterVolumeSpecName: "kube-api-access-cxrnl") pod "d8e9ee79-2a19-4e08-9e57-a1d745c7976e" (UID: "d8e9ee79-2a19-4e08-9e57-a1d745c7976e"). InnerVolumeSpecName "kube-api-access-cxrnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.812950 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxrnl\" (UniqueName: \"kubernetes.io/projected/d8e9ee79-2a19-4e08-9e57-a1d745c7976e-kube-api-access-cxrnl\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.813004 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5gw\" (UniqueName: \"kubernetes.io/projected/77c731a0-29c8-476c-98a0-3bf96579183f-kube-api-access-lc5gw\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.813016 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7g7j\" (UniqueName: \"kubernetes.io/projected/146980d0-cc5b-44ad-87c7-fd6463f25659-kube-api-access-p7g7j\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.850761 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:02:52 crc 
kubenswrapper[4922]: I0929 10:02:52.927766 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0045-account-create-p2pzv" event={"ID":"77c731a0-29c8-476c-98a0-3bf96579183f","Type":"ContainerDied","Data":"7d7392d62506e54bb5bbebf41db2006cfe74e97058b204981503c0a19a297c1b"} Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.928259 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d7392d62506e54bb5bbebf41db2006cfe74e97058b204981503c0a19a297c1b" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.927842 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0045-account-create-p2pzv" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.931208 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2ee9-account-create-g6ttg" event={"ID":"146980d0-cc5b-44ad-87c7-fd6463f25659","Type":"ContainerDied","Data":"a7e41743c6d5f022fcce9e2cf33b5b754f3184b6b9dec749954be51865198c9e"} Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.931280 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7e41743c6d5f022fcce9e2cf33b5b754f3184b6b9dec749954be51865198c9e" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.931298 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2ee9-account-create-g6ttg" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.933381 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerStarted","Data":"b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988"} Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.935143 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b33d-account-create-7wvb5" Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.935130 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b33d-account-create-7wvb5" event={"ID":"d8e9ee79-2a19-4e08-9e57-a1d745c7976e","Type":"ContainerDied","Data":"b733d789a41b14fa487c4557612684865cdc99ab3bab872b95a4f59216bfc2dd"} Sep 29 10:02:52 crc kubenswrapper[4922]: I0929 10:02:52.935202 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b733d789a41b14fa487c4557612684865cdc99ab3bab872b95a4f59216bfc2dd" Sep 29 10:02:53 crc kubenswrapper[4922]: I0929 10:02:53.594557 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:02:53 crc kubenswrapper[4922]: I0929 10:02:53.595392 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6f597643-851f-448e-996d-5a30b83c535f" containerName="glance-log" containerID="cri-o://0acd7c1aa28e817ab6423aae8c924696b3f2cc5b923255246f1c10e2f20ab09b" gracePeriod=30 Sep 29 10:02:53 crc kubenswrapper[4922]: I0929 10:02:53.595513 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6f597643-851f-448e-996d-5a30b83c535f" containerName="glance-httpd" containerID="cri-o://60256daf381d89ac8700d2ba4428ff35b3b18e65d35458e0a160ec52ccfed346" gracePeriod=30 Sep 29 10:02:53 crc kubenswrapper[4922]: I0929 10:02:53.949501 4922 generic.go:334] "Generic (PLEG): container finished" podID="6f597643-851f-448e-996d-5a30b83c535f" containerID="0acd7c1aa28e817ab6423aae8c924696b3f2cc5b923255246f1c10e2f20ab09b" exitCode=143 Sep 29 10:02:53 crc kubenswrapper[4922]: I0929 10:02:53.949596 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6f597643-851f-448e-996d-5a30b83c535f","Type":"ContainerDied","Data":"0acd7c1aa28e817ab6423aae8c924696b3f2cc5b923255246f1c10e2f20ab09b"} Sep 29 10:02:53 crc kubenswrapper[4922]: I0929 10:02:53.952229 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerStarted","Data":"d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b"} Sep 29 10:02:54 crc kubenswrapper[4922]: I0929 10:02:54.969346 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerStarted","Data":"42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f"} Sep 29 10:02:55 crc kubenswrapper[4922]: I0929 10:02:55.990081 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerStarted","Data":"b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b"} Sep 29 10:02:55 crc kubenswrapper[4922]: I0929 10:02:55.990876 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="sg-core" containerID="cri-o://42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f" gracePeriod=30 Sep 29 10:02:55 crc kubenswrapper[4922]: I0929 10:02:55.990997 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="proxy-httpd" containerID="cri-o://b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b" gracePeriod=30 Sep 29 10:02:55 crc kubenswrapper[4922]: I0929 10:02:55.991001 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="ceilometer-central-agent" 
containerID="cri-o://b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988" gracePeriod=30 Sep 29 10:02:55 crc kubenswrapper[4922]: I0929 10:02:55.991091 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="ceilometer-notification-agent" containerID="cri-o://d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b" gracePeriod=30 Sep 29 10:02:56 crc kubenswrapper[4922]: I0929 10:02:56.033034 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.52343552 podStartE2EDuration="6.03300159s" podCreationTimestamp="2025-09-29 10:02:50 +0000 UTC" firstStartedPulling="2025-09-29 10:02:51.874826974 +0000 UTC m=+1097.241057258" lastFinishedPulling="2025-09-29 10:02:55.384393064 +0000 UTC m=+1100.750623328" observedRunningTime="2025-09-29 10:02:56.01789066 +0000 UTC m=+1101.384120924" watchObservedRunningTime="2025-09-29 10:02:56.03300159 +0000 UTC m=+1101.399231854" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.004613 4922 generic.go:334] "Generic (PLEG): container finished" podID="6f597643-851f-448e-996d-5a30b83c535f" containerID="60256daf381d89ac8700d2ba4428ff35b3b18e65d35458e0a160ec52ccfed346" exitCode=0 Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.004669 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f597643-851f-448e-996d-5a30b83c535f","Type":"ContainerDied","Data":"60256daf381d89ac8700d2ba4428ff35b3b18e65d35458e0a160ec52ccfed346"} Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.009282 4922 generic.go:334] "Generic (PLEG): container finished" podID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerID="b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b" exitCode=0 Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.009329 4922 generic.go:334] "Generic (PLEG): 
container finished" podID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerID="42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f" exitCode=2 Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.009341 4922 generic.go:334] "Generic (PLEG): container finished" podID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerID="d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b" exitCode=0 Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.009397 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerDied","Data":"b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b"} Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.009489 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerDied","Data":"42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f"} Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.009516 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerDied","Data":"d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b"} Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.313512 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.426250 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-httpd-run\") pod \"6f597643-851f-448e-996d-5a30b83c535f\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.426355 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-scripts\") pod \"6f597643-851f-448e-996d-5a30b83c535f\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.426482 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-logs\") pod \"6f597643-851f-448e-996d-5a30b83c535f\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.426498 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-config-data\") pod \"6f597643-851f-448e-996d-5a30b83c535f\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.426534 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-public-tls-certs\") pod \"6f597643-851f-448e-996d-5a30b83c535f\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.426584 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cmr9\" (UniqueName: 
\"kubernetes.io/projected/6f597643-851f-448e-996d-5a30b83c535f-kube-api-access-5cmr9\") pod \"6f597643-851f-448e-996d-5a30b83c535f\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.426664 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6f597643-851f-448e-996d-5a30b83c535f\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.426711 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-combined-ca-bundle\") pod \"6f597643-851f-448e-996d-5a30b83c535f\" (UID: \"6f597643-851f-448e-996d-5a30b83c535f\") " Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.430355 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6f597643-851f-448e-996d-5a30b83c535f" (UID: "6f597643-851f-448e-996d-5a30b83c535f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.430630 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-logs" (OuterVolumeSpecName: "logs") pod "6f597643-851f-448e-996d-5a30b83c535f" (UID: "6f597643-851f-448e-996d-5a30b83c535f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.438132 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "6f597643-851f-448e-996d-5a30b83c535f" (UID: "6f597643-851f-448e-996d-5a30b83c535f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.459456 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f597643-851f-448e-996d-5a30b83c535f-kube-api-access-5cmr9" (OuterVolumeSpecName: "kube-api-access-5cmr9") pod "6f597643-851f-448e-996d-5a30b83c535f" (UID: "6f597643-851f-448e-996d-5a30b83c535f"). InnerVolumeSpecName "kube-api-access-5cmr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.461287 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-scripts" (OuterVolumeSpecName: "scripts") pod "6f597643-851f-448e-996d-5a30b83c535f" (UID: "6f597643-851f-448e-996d-5a30b83c535f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.483208 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f597643-851f-448e-996d-5a30b83c535f" (UID: "6f597643-851f-448e-996d-5a30b83c535f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.504823 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6f597643-851f-448e-996d-5a30b83c535f" (UID: "6f597643-851f-448e-996d-5a30b83c535f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.507578 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-config-data" (OuterVolumeSpecName: "config-data") pod "6f597643-851f-448e-996d-5a30b83c535f" (UID: "6f597643-851f-448e-996d-5a30b83c535f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.531276 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.531324 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.531333 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f597643-851f-448e-996d-5a30b83c535f-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.531343 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 
10:02:57.531351 4922 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.531363 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cmr9\" (UniqueName: \"kubernetes.io/projected/6f597643-851f-448e-996d-5a30b83c535f-kube-api-access-5cmr9\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.531392 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.531403 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f597643-851f-448e-996d-5a30b83c535f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.553262 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 29 10:02:57 crc kubenswrapper[4922]: I0929 10:02:57.635662 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.021181 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f597643-851f-448e-996d-5a30b83c535f","Type":"ContainerDied","Data":"3940b92a157226472b741e8629dcd006c90e31c6b7ae179fb0165e6bb3ae381e"} Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.021685 4922 scope.go:117] "RemoveContainer" containerID="60256daf381d89ac8700d2ba4428ff35b3b18e65d35458e0a160ec52ccfed346" Sep 
29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.021238 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.069650 4922 scope.go:117] "RemoveContainer" containerID="0acd7c1aa28e817ab6423aae8c924696b3f2cc5b923255246f1c10e2f20ab09b" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.086199 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.101863 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.116733 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:02:58 crc kubenswrapper[4922]: E0929 10:02:58.117433 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146980d0-cc5b-44ad-87c7-fd6463f25659" containerName="mariadb-account-create" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.117463 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="146980d0-cc5b-44ad-87c7-fd6463f25659" containerName="mariadb-account-create" Sep 29 10:02:58 crc kubenswrapper[4922]: E0929 10:02:58.117514 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e9ee79-2a19-4e08-9e57-a1d745c7976e" containerName="mariadb-account-create" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.117524 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e9ee79-2a19-4e08-9e57-a1d745c7976e" containerName="mariadb-account-create" Sep 29 10:02:58 crc kubenswrapper[4922]: E0929 10:02:58.117541 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c731a0-29c8-476c-98a0-3bf96579183f" containerName="mariadb-account-create" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.117548 4922 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="77c731a0-29c8-476c-98a0-3bf96579183f" containerName="mariadb-account-create" Sep 29 10:02:58 crc kubenswrapper[4922]: E0929 10:02:58.117568 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f597643-851f-448e-996d-5a30b83c535f" containerName="glance-httpd" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.117574 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f597643-851f-448e-996d-5a30b83c535f" containerName="glance-httpd" Sep 29 10:02:58 crc kubenswrapper[4922]: E0929 10:02:58.117592 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f597643-851f-448e-996d-5a30b83c535f" containerName="glance-log" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.117598 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f597643-851f-448e-996d-5a30b83c535f" containerName="glance-log" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.117794 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c731a0-29c8-476c-98a0-3bf96579183f" containerName="mariadb-account-create" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.117821 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f597643-851f-448e-996d-5a30b83c535f" containerName="glance-httpd" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.124900 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e9ee79-2a19-4e08-9e57-a1d745c7976e" containerName="mariadb-account-create" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.124948 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f597643-851f-448e-996d-5a30b83c535f" containerName="glance-log" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.124975 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="146980d0-cc5b-44ad-87c7-fd6463f25659" containerName="mariadb-account-create" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.126360 4922 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.132709 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.133237 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.150866 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.247406 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.247523 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4da22caf-781b-42ef-ad66-521d0908aabb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.247551 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-scripts\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.247589 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4da22caf-781b-42ef-ad66-521d0908aabb-logs\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.247625 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-config-data\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.247690 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.247711 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.247744 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2grf\" (UniqueName: \"kubernetes.io/projected/4da22caf-781b-42ef-ad66-521d0908aabb-kube-api-access-f2grf\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.349703 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.349811 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4da22caf-781b-42ef-ad66-521d0908aabb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.349868 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-scripts\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.349913 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4da22caf-781b-42ef-ad66-521d0908aabb-logs\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.350165 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.350443 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4da22caf-781b-42ef-ad66-521d0908aabb-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.351251 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4da22caf-781b-42ef-ad66-521d0908aabb-logs\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.351310 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-config-data\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.351380 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.351418 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.351467 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2grf\" (UniqueName: \"kubernetes.io/projected/4da22caf-781b-42ef-ad66-521d0908aabb-kube-api-access-f2grf\") pod \"glance-default-external-api-0\" (UID: 
\"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.357127 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.358611 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-scripts\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.364666 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.365384 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da22caf-781b-42ef-ad66-521d0908aabb-config-data\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.378513 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2grf\" (UniqueName: \"kubernetes.io/projected/4da22caf-781b-42ef-ad66-521d0908aabb-kube-api-access-f2grf\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" 
Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.398957 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.399020 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.423595 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4da22caf-781b-42ef-ad66-521d0908aabb\") " pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.437803 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.452762 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.471511 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.678526 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cfqkk"] Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.680037 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.687237 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nk5rh" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.687437 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.690526 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.700455 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cfqkk"] Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.762603 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6b5s\" (UniqueName: \"kubernetes.io/projected/e1a157db-ed95-45a6-9e10-acad67ba9e0f-kube-api-access-n6b5s\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.762675 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.762822 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-config-data\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " 
pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.762941 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-scripts\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.864497 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.865394 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-config-data\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.865551 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-scripts\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.865650 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6b5s\" (UniqueName: \"kubernetes.io/projected/e1a157db-ed95-45a6-9e10-acad67ba9e0f-kube-api-access-n6b5s\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " 
pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.872954 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.873166 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-config-data\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.874183 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-scripts\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:58 crc kubenswrapper[4922]: I0929 10:02:58.885713 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6b5s\" (UniqueName: \"kubernetes.io/projected/e1a157db-ed95-45a6-9e10-acad67ba9e0f-kube-api-access-n6b5s\") pod \"nova-cell0-conductor-db-sync-cfqkk\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.011298 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.043939 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.043979 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.074060 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.074130 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.074196 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.075090 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a477bfa77fba14648b7136b725546b719661c46663d83dacb1d16385e73fcc2"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.075155 4922 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://2a477bfa77fba14648b7136b725546b719661c46663d83dacb1d16385e73fcc2" gracePeriod=600 Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.289684 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.480434 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f597643-851f-448e-996d-5a30b83c535f" path="/var/lib/kubelet/pods/6f597643-851f-448e-996d-5a30b83c535f/volumes" Sep 29 10:02:59 crc kubenswrapper[4922]: I0929 10:02:59.572193 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cfqkk"] Sep 29 10:03:00 crc kubenswrapper[4922]: I0929 10:03:00.056473 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cfqkk" event={"ID":"e1a157db-ed95-45a6-9e10-acad67ba9e0f","Type":"ContainerStarted","Data":"16a567a1d5fedd5c85bd9426fd10c26db20123da430e267428028cd46a8620b5"} Sep 29 10:03:00 crc kubenswrapper[4922]: I0929 10:03:00.060588 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4da22caf-781b-42ef-ad66-521d0908aabb","Type":"ContainerStarted","Data":"d0567d12cee2d671f51aff2552865e4e6e0da20fa1db9c48d3f020fc81ff8551"} Sep 29 10:03:00 crc kubenswrapper[4922]: I0929 10:03:00.065855 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="2a477bfa77fba14648b7136b725546b719661c46663d83dacb1d16385e73fcc2" exitCode=0 Sep 29 10:03:00 crc kubenswrapper[4922]: I0929 10:03:00.066090 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"2a477bfa77fba14648b7136b725546b719661c46663d83dacb1d16385e73fcc2"} Sep 29 10:03:00 crc kubenswrapper[4922]: I0929 10:03:00.066177 4922 scope.go:117] "RemoveContainer" containerID="7884bf02a997a61f9124b5ac0faf1322742549dc99578bbb4ee5d6c1d6b88217" Sep 29 10:03:01 crc kubenswrapper[4922]: I0929 10:03:01.082963 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:03:01 crc kubenswrapper[4922]: I0929 10:03:01.083355 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:03:01 crc kubenswrapper[4922]: I0929 10:03:01.509087 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 10:03:01 crc kubenswrapper[4922]: I0929 10:03:01.509159 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.879872 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.958938 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-scripts\") pod \"afa3df04-1c02-486b-b02f-12a9c40aedb8\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.959192 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-sg-core-conf-yaml\") pod \"afa3df04-1c02-486b-b02f-12a9c40aedb8\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.959241 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhtg4\" (UniqueName: \"kubernetes.io/projected/afa3df04-1c02-486b-b02f-12a9c40aedb8-kube-api-access-hhtg4\") pod \"afa3df04-1c02-486b-b02f-12a9c40aedb8\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.959449 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-config-data\") pod \"afa3df04-1c02-486b-b02f-12a9c40aedb8\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.959489 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-log-httpd\") pod \"afa3df04-1c02-486b-b02f-12a9c40aedb8\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.959560 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-run-httpd\") pod \"afa3df04-1c02-486b-b02f-12a9c40aedb8\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.959723 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-combined-ca-bundle\") pod \"afa3df04-1c02-486b-b02f-12a9c40aedb8\" (UID: \"afa3df04-1c02-486b-b02f-12a9c40aedb8\") " Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.962620 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "afa3df04-1c02-486b-b02f-12a9c40aedb8" (UID: "afa3df04-1c02-486b-b02f-12a9c40aedb8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.963358 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.975668 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa3df04-1c02-486b-b02f-12a9c40aedb8-kube-api-access-hhtg4" (OuterVolumeSpecName: "kube-api-access-hhtg4") pod "afa3df04-1c02-486b-b02f-12a9c40aedb8" (UID: "afa3df04-1c02-486b-b02f-12a9c40aedb8"). InnerVolumeSpecName "kube-api-access-hhtg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.984709 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "afa3df04-1c02-486b-b02f-12a9c40aedb8" (UID: "afa3df04-1c02-486b-b02f-12a9c40aedb8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:03:02 crc kubenswrapper[4922]: I0929 10:03:02.984898 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-scripts" (OuterVolumeSpecName: "scripts") pod "afa3df04-1c02-486b-b02f-12a9c40aedb8" (UID: "afa3df04-1c02-486b-b02f-12a9c40aedb8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.030209 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "afa3df04-1c02-486b-b02f-12a9c40aedb8" (UID: "afa3df04-1c02-486b-b02f-12a9c40aedb8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.068541 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.068577 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhtg4\" (UniqueName: \"kubernetes.io/projected/afa3df04-1c02-486b-b02f-12a9c40aedb8-kube-api-access-hhtg4\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.068590 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afa3df04-1c02-486b-b02f-12a9c40aedb8-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.068598 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.118721 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4da22caf-781b-42ef-ad66-521d0908aabb","Type":"ContainerStarted","Data":"c8e5a3e90df0508dd657cfff310e277640a6febe911f352245940a1e959fb902"} Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.124382 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"0a1549b8b442c454d49bdd016344f1f8e0f5b0aa9b4f4d0ded96439b8c2d215c"} Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.132442 4922 generic.go:334] "Generic (PLEG): container finished" podID="afa3df04-1c02-486b-b02f-12a9c40aedb8" 
containerID="b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988" exitCode=0 Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.132507 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerDied","Data":"b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988"} Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.132542 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afa3df04-1c02-486b-b02f-12a9c40aedb8","Type":"ContainerDied","Data":"e8c81f7990c2c7553ef72504e06632f463c07626fcb9367ad3537f6b1006650f"} Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.132562 4922 scope.go:117] "RemoveContainer" containerID="b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.132690 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.169946 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-config-data" (OuterVolumeSpecName: "config-data") pod "afa3df04-1c02-486b-b02f-12a9c40aedb8" (UID: "afa3df04-1c02-486b-b02f-12a9c40aedb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.170226 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afa3df04-1c02-486b-b02f-12a9c40aedb8" (UID: "afa3df04-1c02-486b-b02f-12a9c40aedb8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.171953 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.171998 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa3df04-1c02-486b-b02f-12a9c40aedb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.180332 4922 scope.go:117] "RemoveContainer" containerID="42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.226900 4922 scope.go:117] "RemoveContainer" containerID="d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.254479 4922 scope.go:117] "RemoveContainer" containerID="b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.299979 4922 scope.go:117] "RemoveContainer" containerID="b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b" Sep 29 10:03:03 crc kubenswrapper[4922]: E0929 10:03:03.301599 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b\": container with ID starting with b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b not found: ID does not exist" containerID="b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.301642 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b"} 
err="failed to get container status \"b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b\": rpc error: code = NotFound desc = could not find container \"b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b\": container with ID starting with b9bb6a77a2ea3cba5717c35e96aaa1f9f65012180f3063a691eb226c4e5fa08b not found: ID does not exist" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.301691 4922 scope.go:117] "RemoveContainer" containerID="42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f" Sep 29 10:03:03 crc kubenswrapper[4922]: E0929 10:03:03.303182 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f\": container with ID starting with 42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f not found: ID does not exist" containerID="42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.303227 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f"} err="failed to get container status \"42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f\": rpc error: code = NotFound desc = could not find container \"42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f\": container with ID starting with 42616268cb980a08c2e94aff3669e57251da7b227f29c5014ea30000965c4f9f not found: ID does not exist" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.303243 4922 scope.go:117] "RemoveContainer" containerID="d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b" Sep 29 10:03:03 crc kubenswrapper[4922]: E0929 10:03:03.303691 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b\": container with ID starting with d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b not found: ID does not exist" containerID="d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.303738 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b"} err="failed to get container status \"d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b\": rpc error: code = NotFound desc = could not find container \"d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b\": container with ID starting with d363f619c82420f0301dd01b9001b7bb65ae2af746a71f39dc954c9161fc2e2b not found: ID does not exist" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.303755 4922 scope.go:117] "RemoveContainer" containerID="b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988" Sep 29 10:03:03 crc kubenswrapper[4922]: E0929 10:03:03.304084 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988\": container with ID starting with b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988 not found: ID does not exist" containerID="b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.304106 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988"} err="failed to get container status \"b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988\": rpc error: code = NotFound desc = could not find container \"b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988\": container with ID 
starting with b3d9e2e711fe70360515016465da7ed9720869a5627c13a5461149c1f3f44988 not found: ID does not exist" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.475443 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.484423 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.503221 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:03 crc kubenswrapper[4922]: E0929 10:03:03.504976 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="ceilometer-notification-agent" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.505102 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="ceilometer-notification-agent" Sep 29 10:03:03 crc kubenswrapper[4922]: E0929 10:03:03.505236 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="ceilometer-central-agent" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.505296 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="ceilometer-central-agent" Sep 29 10:03:03 crc kubenswrapper[4922]: E0929 10:03:03.505361 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="sg-core" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.505417 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="sg-core" Sep 29 10:03:03 crc kubenswrapper[4922]: E0929 10:03:03.505487 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="proxy-httpd" Sep 29 10:03:03 crc 
kubenswrapper[4922]: I0929 10:03:03.505541 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="proxy-httpd" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.505846 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="ceilometer-notification-agent" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.505918 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="sg-core" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.505987 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="proxy-httpd" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.506047 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" containerName="ceilometer-central-agent" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.508705 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.519289 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.519389 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.538388 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.684878 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-log-httpd\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.685077 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-config-data\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.685137 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4tvz\" (UniqueName: \"kubernetes.io/projected/6895d863-35fc-41ab-a083-76c6c00c9dfa-kube-api-access-r4tvz\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.685180 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.685274 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-scripts\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.685306 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.685363 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-run-httpd\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.786526 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-config-data\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.786574 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4tvz\" (UniqueName: \"kubernetes.io/projected/6895d863-35fc-41ab-a083-76c6c00c9dfa-kube-api-access-r4tvz\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.786596 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.786631 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-scripts\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.786651 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.786694 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-run-httpd\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.786735 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-log-httpd\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.787182 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-log-httpd\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " 
pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.788616 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-run-httpd\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.793502 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.794824 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-config-data\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.797656 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.797902 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-scripts\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.807714 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4tvz\" (UniqueName: 
\"kubernetes.io/projected/6895d863-35fc-41ab-a083-76c6c00c9dfa-kube-api-access-r4tvz\") pod \"ceilometer-0\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " pod="openstack/ceilometer-0" Sep 29 10:03:03 crc kubenswrapper[4922]: I0929 10:03:03.837819 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:04 crc kubenswrapper[4922]: I0929 10:03:04.188013 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4da22caf-781b-42ef-ad66-521d0908aabb","Type":"ContainerStarted","Data":"824796ae1bc351596dc7a0ed8d841d7e35ba3668f1fcba622c867af2f309087f"} Sep 29 10:03:04 crc kubenswrapper[4922]: I0929 10:03:04.513267 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.513241646 podStartE2EDuration="6.513241646s" podCreationTimestamp="2025-09-29 10:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:04.24432933 +0000 UTC m=+1109.610559584" watchObservedRunningTime="2025-09-29 10:03:04.513241646 +0000 UTC m=+1109.879471910" Sep 29 10:03:04 crc kubenswrapper[4922]: I0929 10:03:04.526311 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:04 crc kubenswrapper[4922]: I0929 10:03:04.874843 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:05 crc kubenswrapper[4922]: I0929 10:03:05.234549 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerStarted","Data":"bff9a882496053487e7a78c8798c902aba9950a5b8ee452bee487b4769e9bd67"} Sep 29 10:03:05 crc kubenswrapper[4922]: I0929 10:03:05.466628 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="afa3df04-1c02-486b-b02f-12a9c40aedb8" path="/var/lib/kubelet/pods/afa3df04-1c02-486b-b02f-12a9c40aedb8/volumes" Sep 29 10:03:06 crc kubenswrapper[4922]: I0929 10:03:06.246032 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerStarted","Data":"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333"} Sep 29 10:03:08 crc kubenswrapper[4922]: I0929 10:03:08.454041 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 10:03:08 crc kubenswrapper[4922]: I0929 10:03:08.454696 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 10:03:08 crc kubenswrapper[4922]: I0929 10:03:08.520616 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 10:03:08 crc kubenswrapper[4922]: I0929 10:03:08.521297 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 10:03:09 crc kubenswrapper[4922]: I0929 10:03:09.278769 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:03:09 crc kubenswrapper[4922]: I0929 10:03:09.279240 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 10:03:11 crc kubenswrapper[4922]: I0929 10:03:11.298279 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:03:11 crc kubenswrapper[4922]: I0929 10:03:11.299064 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 10:03:11 crc kubenswrapper[4922]: I0929 10:03:11.326091 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 29 10:03:11 crc 
kubenswrapper[4922]: I0929 10:03:11.381793 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 29 10:03:12 crc kubenswrapper[4922]: I0929 10:03:12.315877 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cfqkk" event={"ID":"e1a157db-ed95-45a6-9e10-acad67ba9e0f","Type":"ContainerStarted","Data":"1ca57c8afc1ea282b4480c13a992e9caffcc1f5421c18914065e59c8ea52172c"} Sep 29 10:03:12 crc kubenswrapper[4922]: I0929 10:03:12.320669 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerStarted","Data":"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6"} Sep 29 10:03:12 crc kubenswrapper[4922]: I0929 10:03:12.350521 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cfqkk" podStartSLOduration=2.750396549 podStartE2EDuration="14.350493448s" podCreationTimestamp="2025-09-29 10:02:58 +0000 UTC" firstStartedPulling="2025-09-29 10:02:59.569548442 +0000 UTC m=+1104.935778696" lastFinishedPulling="2025-09-29 10:03:11.169645331 +0000 UTC m=+1116.535875595" observedRunningTime="2025-09-29 10:03:12.333369194 +0000 UTC m=+1117.699599498" watchObservedRunningTime="2025-09-29 10:03:12.350493448 +0000 UTC m=+1117.716723752" Sep 29 10:03:13 crc kubenswrapper[4922]: I0929 10:03:13.332903 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerStarted","Data":"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce"} Sep 29 10:03:15 crc kubenswrapper[4922]: I0929 10:03:15.359134 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerStarted","Data":"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018"} Sep 29 10:03:15 crc kubenswrapper[4922]: I0929 10:03:15.359793 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:03:15 crc kubenswrapper[4922]: I0929 10:03:15.359502 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="ceilometer-central-agent" containerID="cri-o://aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333" gracePeriod=30 Sep 29 10:03:15 crc kubenswrapper[4922]: I0929 10:03:15.359942 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="proxy-httpd" containerID="cri-o://8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018" gracePeriod=30 Sep 29 10:03:15 crc kubenswrapper[4922]: I0929 10:03:15.359957 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="sg-core" containerID="cri-o://8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce" gracePeriod=30 Sep 29 10:03:15 crc kubenswrapper[4922]: I0929 10:03:15.360026 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="ceilometer-notification-agent" containerID="cri-o://deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6" gracePeriod=30 Sep 29 10:03:15 crc kubenswrapper[4922]: I0929 10:03:15.389255 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.198213015 podStartE2EDuration="12.38923467s" podCreationTimestamp="2025-09-29 10:03:03 +0000 UTC" 
firstStartedPulling="2025-09-29 10:03:04.539168025 +0000 UTC m=+1109.905398279" lastFinishedPulling="2025-09-29 10:03:14.73018966 +0000 UTC m=+1120.096419934" observedRunningTime="2025-09-29 10:03:15.386859086 +0000 UTC m=+1120.753089370" watchObservedRunningTime="2025-09-29 10:03:15.38923467 +0000 UTC m=+1120.755464944" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.267002 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.372922 4922 generic.go:334] "Generic (PLEG): container finished" podID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerID="8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018" exitCode=0 Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.372977 4922 generic.go:334] "Generic (PLEG): container finished" podID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerID="8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce" exitCode=2 Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.373011 4922 generic.go:334] "Generic (PLEG): container finished" podID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerID="deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6" exitCode=0 Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.373017 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.373041 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerDied","Data":"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018"} Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.373079 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerDied","Data":"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce"} Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.373092 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerDied","Data":"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6"} Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.373103 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerDied","Data":"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333"} Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.373126 4922 scope.go:117] "RemoveContainer" containerID="8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.373023 4922 generic.go:334] "Generic (PLEG): container finished" podID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerID="aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333" exitCode=0 Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.373913 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6895d863-35fc-41ab-a083-76c6c00c9dfa","Type":"ContainerDied","Data":"bff9a882496053487e7a78c8798c902aba9950a5b8ee452bee487b4769e9bd67"} Sep 29 10:03:16 crc 
kubenswrapper[4922]: I0929 10:03:16.409035 4922 scope.go:117] "RemoveContainer" containerID="8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.435115 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4tvz\" (UniqueName: \"kubernetes.io/projected/6895d863-35fc-41ab-a083-76c6c00c9dfa-kube-api-access-r4tvz\") pod \"6895d863-35fc-41ab-a083-76c6c00c9dfa\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.435199 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-log-httpd\") pod \"6895d863-35fc-41ab-a083-76c6c00c9dfa\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.435278 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-run-httpd\") pod \"6895d863-35fc-41ab-a083-76c6c00c9dfa\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.436054 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6895d863-35fc-41ab-a083-76c6c00c9dfa" (UID: "6895d863-35fc-41ab-a083-76c6c00c9dfa"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.436185 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6895d863-35fc-41ab-a083-76c6c00c9dfa" (UID: "6895d863-35fc-41ab-a083-76c6c00c9dfa"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.436239 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-scripts\") pod \"6895d863-35fc-41ab-a083-76c6c00c9dfa\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.436269 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-config-data\") pod \"6895d863-35fc-41ab-a083-76c6c00c9dfa\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.436903 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-combined-ca-bundle\") pod \"6895d863-35fc-41ab-a083-76c6c00c9dfa\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.437098 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-sg-core-conf-yaml\") pod \"6895d863-35fc-41ab-a083-76c6c00c9dfa\" (UID: \"6895d863-35fc-41ab-a083-76c6c00c9dfa\") " Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.439530 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.439731 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6895d863-35fc-41ab-a083-76c6c00c9dfa-log-httpd\") on node \"crc\" DevicePath 
\"\"" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.441964 4922 scope.go:117] "RemoveContainer" containerID="deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.444559 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6895d863-35fc-41ab-a083-76c6c00c9dfa-kube-api-access-r4tvz" (OuterVolumeSpecName: "kube-api-access-r4tvz") pod "6895d863-35fc-41ab-a083-76c6c00c9dfa" (UID: "6895d863-35fc-41ab-a083-76c6c00c9dfa"). InnerVolumeSpecName "kube-api-access-r4tvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.445792 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-scripts" (OuterVolumeSpecName: "scripts") pod "6895d863-35fc-41ab-a083-76c6c00c9dfa" (UID: "6895d863-35fc-41ab-a083-76c6c00c9dfa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.471980 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6895d863-35fc-41ab-a083-76c6c00c9dfa" (UID: "6895d863-35fc-41ab-a083-76c6c00c9dfa"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.518531 4922 scope.go:117] "RemoveContainer" containerID="aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.541463 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4tvz\" (UniqueName: \"kubernetes.io/projected/6895d863-35fc-41ab-a083-76c6c00c9dfa-kube-api-access-r4tvz\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.541510 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.541524 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.547133 4922 scope.go:117] "RemoveContainer" containerID="8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018" Sep 29 10:03:16 crc kubenswrapper[4922]: E0929 10:03:16.548086 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018\": container with ID starting with 8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018 not found: ID does not exist" containerID="8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.548130 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018"} err="failed to get container status 
\"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018\": rpc error: code = NotFound desc = could not find container \"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018\": container with ID starting with 8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.548160 4922 scope.go:117] "RemoveContainer" containerID="8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce" Sep 29 10:03:16 crc kubenswrapper[4922]: E0929 10:03:16.548584 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce\": container with ID starting with 8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce not found: ID does not exist" containerID="8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.548611 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce"} err="failed to get container status \"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce\": rpc error: code = NotFound desc = could not find container \"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce\": container with ID starting with 8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.548628 4922 scope.go:117] "RemoveContainer" containerID="deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6" Sep 29 10:03:16 crc kubenswrapper[4922]: E0929 10:03:16.548984 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6\": container with ID starting with deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6 not found: ID does not exist" containerID="deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.549040 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6"} err="failed to get container status \"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6\": rpc error: code = NotFound desc = could not find container \"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6\": container with ID starting with deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.549084 4922 scope.go:117] "RemoveContainer" containerID="aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333" Sep 29 10:03:16 crc kubenswrapper[4922]: E0929 10:03:16.549420 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333\": container with ID starting with aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333 not found: ID does not exist" containerID="aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.549454 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333"} err="failed to get container status \"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333\": rpc error: code = NotFound desc = could not find container \"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333\": container with ID 
starting with aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.549472 4922 scope.go:117] "RemoveContainer" containerID="8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.550963 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018"} err="failed to get container status \"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018\": rpc error: code = NotFound desc = could not find container \"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018\": container with ID starting with 8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.551028 4922 scope.go:117] "RemoveContainer" containerID="8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.551399 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce"} err="failed to get container status \"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce\": rpc error: code = NotFound desc = could not find container \"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce\": container with ID starting with 8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.551422 4922 scope.go:117] "RemoveContainer" containerID="deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.551723 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6"} err="failed to get container status \"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6\": rpc error: code = NotFound desc = could not find container \"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6\": container with ID starting with deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.551777 4922 scope.go:117] "RemoveContainer" containerID="aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.552038 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333"} err="failed to get container status \"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333\": rpc error: code = NotFound desc = could not find container \"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333\": container with ID starting with aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.552057 4922 scope.go:117] "RemoveContainer" containerID="8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.552317 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018"} err="failed to get container status \"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018\": rpc error: code = NotFound desc = could not find container \"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018\": container with ID starting with 8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018 not found: ID does not 
exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.552363 4922 scope.go:117] "RemoveContainer" containerID="8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.552635 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce"} err="failed to get container status \"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce\": rpc error: code = NotFound desc = could not find container \"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce\": container with ID starting with 8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.552660 4922 scope.go:117] "RemoveContainer" containerID="deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.553028 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6"} err="failed to get container status \"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6\": rpc error: code = NotFound desc = could not find container \"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6\": container with ID starting with deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.553062 4922 scope.go:117] "RemoveContainer" containerID="aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.553399 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333"} err="failed to get container status 
\"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333\": rpc error: code = NotFound desc = could not find container \"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333\": container with ID starting with aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.553447 4922 scope.go:117] "RemoveContainer" containerID="8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.553722 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018"} err="failed to get container status \"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018\": rpc error: code = NotFound desc = could not find container \"8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018\": container with ID starting with 8c3667a676e6f979181d205008ab43601e1f3863514c1260f9b2d9c85cfe4018 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.553748 4922 scope.go:117] "RemoveContainer" containerID="8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.554051 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce"} err="failed to get container status \"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce\": rpc error: code = NotFound desc = could not find container \"8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce\": container with ID starting with 8804d8d4d4e0052e66c06129cb39ff7548dc388141de68357d051c1a05f867ce not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.554077 4922 scope.go:117] "RemoveContainer" 
containerID="deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.554381 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6"} err="failed to get container status \"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6\": rpc error: code = NotFound desc = could not find container \"deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6\": container with ID starting with deeb06eb73b87ed97b32e12c43dc8f0eed6667246f102ccaed4b6e653c9303c6 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.554409 4922 scope.go:117] "RemoveContainer" containerID="aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.554681 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333"} err="failed to get container status \"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333\": rpc error: code = NotFound desc = could not find container \"aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333\": container with ID starting with aa6104c4bb973c53afca6db97443fbb928375be1c0e735ffc8fbaeca47ada333 not found: ID does not exist" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.563749 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-config-data" (OuterVolumeSpecName: "config-data") pod "6895d863-35fc-41ab-a083-76c6c00c9dfa" (UID: "6895d863-35fc-41ab-a083-76c6c00c9dfa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.564314 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6895d863-35fc-41ab-a083-76c6c00c9dfa" (UID: "6895d863-35fc-41ab-a083-76c6c00c9dfa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.643290 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.643342 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6895d863-35fc-41ab-a083-76c6c00c9dfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.712352 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.724379 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.741151 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:16 crc kubenswrapper[4922]: E0929 10:03:16.741719 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="proxy-httpd" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.741737 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="proxy-httpd" Sep 29 10:03:16 crc kubenswrapper[4922]: E0929 10:03:16.741756 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="ceilometer-notification-agent" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.741764 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="ceilometer-notification-agent" Sep 29 10:03:16 crc kubenswrapper[4922]: E0929 10:03:16.741781 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="ceilometer-central-agent" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.741789 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="ceilometer-central-agent" Sep 29 10:03:16 crc kubenswrapper[4922]: E0929 10:03:16.741853 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="sg-core" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.741862 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="sg-core" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.742106 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="sg-core" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.742121 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="proxy-httpd" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.742137 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="ceilometer-central-agent" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.742146 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" containerName="ceilometer-notification-agent" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.744188 4922 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.745992 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-config-data\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.746063 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.746160 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-scripts\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.746187 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-run-httpd\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.746204 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 
10:03:16.746230 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-log-httpd\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.746267 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpnw6\" (UniqueName: \"kubernetes.io/projected/0a806f02-d7e2-4d51-b21c-cb63c0475e53-kube-api-access-dpnw6\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.749998 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.750323 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.762934 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.848062 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-log-httpd\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.848144 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpnw6\" (UniqueName: \"kubernetes.io/projected/0a806f02-d7e2-4d51-b21c-cb63c0475e53-kube-api-access-dpnw6\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.848224 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-config-data\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.848277 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.848375 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-scripts\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.848411 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-run-httpd\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.848439 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.848703 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-log-httpd\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " 
pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.849132 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-run-httpd\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.854539 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.854604 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.855887 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-scripts\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.857956 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-config-data\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:16 crc kubenswrapper[4922]: I0929 10:03:16.874976 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpnw6\" (UniqueName: 
\"kubernetes.io/projected/0a806f02-d7e2-4d51-b21c-cb63c0475e53-kube-api-access-dpnw6\") pod \"ceilometer-0\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " pod="openstack/ceilometer-0" Sep 29 10:03:17 crc kubenswrapper[4922]: I0929 10:03:17.119134 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:17 crc kubenswrapper[4922]: I0929 10:03:17.465123 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6895d863-35fc-41ab-a083-76c6c00c9dfa" path="/var/lib/kubelet/pods/6895d863-35fc-41ab-a083-76c6c00c9dfa/volumes" Sep 29 10:03:17 crc kubenswrapper[4922]: I0929 10:03:17.723740 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:18 crc kubenswrapper[4922]: I0929 10:03:18.423314 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerStarted","Data":"5e378ddb3ab81c95aa9de229c4e89d71260f39b7ebc80c2719083bd52ba0bf02"} Sep 29 10:03:19 crc kubenswrapper[4922]: I0929 10:03:19.435020 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerStarted","Data":"0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923"} Sep 29 10:03:20 crc kubenswrapper[4922]: I0929 10:03:20.451706 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerStarted","Data":"9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6"} Sep 29 10:03:21 crc kubenswrapper[4922]: I0929 10:03:21.469108 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerStarted","Data":"8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f"} Sep 29 10:03:22 crc kubenswrapper[4922]: I0929 
10:03:22.487741 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerStarted","Data":"be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f"} Sep 29 10:03:22 crc kubenswrapper[4922]: I0929 10:03:22.488723 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:03:22 crc kubenswrapper[4922]: I0929 10:03:22.519624 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.667623759 podStartE2EDuration="6.519602821s" podCreationTimestamp="2025-09-29 10:03:16 +0000 UTC" firstStartedPulling="2025-09-29 10:03:17.748533639 +0000 UTC m=+1123.114763903" lastFinishedPulling="2025-09-29 10:03:21.600512691 +0000 UTC m=+1126.966742965" observedRunningTime="2025-09-29 10:03:22.514997729 +0000 UTC m=+1127.881228003" watchObservedRunningTime="2025-09-29 10:03:22.519602821 +0000 UTC m=+1127.885833095" Sep 29 10:03:25 crc kubenswrapper[4922]: I0929 10:03:25.528734 4922 generic.go:334] "Generic (PLEG): container finished" podID="e1a157db-ed95-45a6-9e10-acad67ba9e0f" containerID="1ca57c8afc1ea282b4480c13a992e9caffcc1f5421c18914065e59c8ea52172c" exitCode=0 Sep 29 10:03:25 crc kubenswrapper[4922]: I0929 10:03:25.528865 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cfqkk" event={"ID":"e1a157db-ed95-45a6-9e10-acad67ba9e0f","Type":"ContainerDied","Data":"1ca57c8afc1ea282b4480c13a992e9caffcc1f5421c18914065e59c8ea52172c"} Sep 29 10:03:26 crc kubenswrapper[4922]: I0929 10:03:26.983018 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.068756 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-combined-ca-bundle\") pod \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.068884 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-config-data\") pod \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.069062 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6b5s\" (UniqueName: \"kubernetes.io/projected/e1a157db-ed95-45a6-9e10-acad67ba9e0f-kube-api-access-n6b5s\") pod \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.069086 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-scripts\") pod \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\" (UID: \"e1a157db-ed95-45a6-9e10-acad67ba9e0f\") " Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.078733 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-scripts" (OuterVolumeSpecName: "scripts") pod "e1a157db-ed95-45a6-9e10-acad67ba9e0f" (UID: "e1a157db-ed95-45a6-9e10-acad67ba9e0f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.079709 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a157db-ed95-45a6-9e10-acad67ba9e0f-kube-api-access-n6b5s" (OuterVolumeSpecName: "kube-api-access-n6b5s") pod "e1a157db-ed95-45a6-9e10-acad67ba9e0f" (UID: "e1a157db-ed95-45a6-9e10-acad67ba9e0f"). InnerVolumeSpecName "kube-api-access-n6b5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.108020 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-config-data" (OuterVolumeSpecName: "config-data") pod "e1a157db-ed95-45a6-9e10-acad67ba9e0f" (UID: "e1a157db-ed95-45a6-9e10-acad67ba9e0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.122926 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1a157db-ed95-45a6-9e10-acad67ba9e0f" (UID: "e1a157db-ed95-45a6-9e10-acad67ba9e0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.172601 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.173096 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.173117 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6b5s\" (UniqueName: \"kubernetes.io/projected/e1a157db-ed95-45a6-9e10-acad67ba9e0f-kube-api-access-n6b5s\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.173127 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a157db-ed95-45a6-9e10-acad67ba9e0f-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.553967 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cfqkk" event={"ID":"e1a157db-ed95-45a6-9e10-acad67ba9e0f","Type":"ContainerDied","Data":"16a567a1d5fedd5c85bd9426fd10c26db20123da430e267428028cd46a8620b5"} Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.554030 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16a567a1d5fedd5c85bd9426fd10c26db20123da430e267428028cd46a8620b5" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.554101 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cfqkk" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.762401 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 10:03:27 crc kubenswrapper[4922]: E0929 10:03:27.763491 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a157db-ed95-45a6-9e10-acad67ba9e0f" containerName="nova-cell0-conductor-db-sync" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.763581 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a157db-ed95-45a6-9e10-acad67ba9e0f" containerName="nova-cell0-conductor-db-sync" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.763865 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a157db-ed95-45a6-9e10-acad67ba9e0f" containerName="nova-cell0-conductor-db-sync" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.764743 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.768519 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nk5rh" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.768749 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.771493 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.892249 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxgll\" (UniqueName: \"kubernetes.io/projected/e1139d93-2038-4fa3-b31c-1e7ddedd0bb7-kube-api-access-gxgll\") pod \"nova-cell0-conductor-0\" (UID: \"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:27 crc 
kubenswrapper[4922]: I0929 10:03:27.892348 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1139d93-2038-4fa3-b31c-1e7ddedd0bb7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.892987 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1139d93-2038-4fa3-b31c-1e7ddedd0bb7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.995080 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxgll\" (UniqueName: \"kubernetes.io/projected/e1139d93-2038-4fa3-b31c-1e7ddedd0bb7-kube-api-access-gxgll\") pod \"nova-cell0-conductor-0\" (UID: \"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.995189 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1139d93-2038-4fa3-b31c-1e7ddedd0bb7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:27 crc kubenswrapper[4922]: I0929 10:03:27.995488 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1139d93-2038-4fa3-b31c-1e7ddedd0bb7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:28 crc kubenswrapper[4922]: I0929 10:03:28.002680 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1139d93-2038-4fa3-b31c-1e7ddedd0bb7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:28 crc kubenswrapper[4922]: I0929 10:03:28.003107 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1139d93-2038-4fa3-b31c-1e7ddedd0bb7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:28 crc kubenswrapper[4922]: I0929 10:03:28.027504 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxgll\" (UniqueName: \"kubernetes.io/projected/e1139d93-2038-4fa3-b31c-1e7ddedd0bb7-kube-api-access-gxgll\") pod \"nova-cell0-conductor-0\" (UID: \"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7\") " pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:28 crc kubenswrapper[4922]: I0929 10:03:28.116825 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:28 crc kubenswrapper[4922]: I0929 10:03:28.407899 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 10:03:28 crc kubenswrapper[4922]: I0929 10:03:28.570604 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7","Type":"ContainerStarted","Data":"3979e363d7df0244722010cd1a0a0c2361a1d6b014e77071ad525d3ec1abde2e"} Sep 29 10:03:29 crc kubenswrapper[4922]: I0929 10:03:29.584699 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e1139d93-2038-4fa3-b31c-1e7ddedd0bb7","Type":"ContainerStarted","Data":"82b5ecdd7219670d86cc278e98555e995b9ab6a50b7b62f94938e95aed4c4360"} Sep 29 10:03:30 crc kubenswrapper[4922]: I0929 10:03:30.614810 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.173747 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.219933 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=6.219895292 podStartE2EDuration="6.219895292s" podCreationTimestamp="2025-09-29 10:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:29.610906378 +0000 UTC m=+1134.977136642" watchObservedRunningTime="2025-09-29 10:03:33.219895292 +0000 UTC m=+1138.586125596" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.737523 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-8fjc9"] Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.740080 4922 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.742930 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.744481 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.753594 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8fjc9"] Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.840419 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s5wm\" (UniqueName: \"kubernetes.io/projected/93689363-9408-4bc9-b502-0471871ff5ba-kube-api-access-2s5wm\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.840506 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.840597 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-scripts\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.841685 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-config-data\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.943371 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-scripts\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.943462 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-config-data\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.943540 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s5wm\" (UniqueName: \"kubernetes.io/projected/93689363-9408-4bc9-b502-0471871ff5ba-kube-api-access-2s5wm\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.943578 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.951749 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-scripts\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.952026 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-config-data\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.958235 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.963096 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.964986 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.968409 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.979175 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s5wm\" (UniqueName: \"kubernetes.io/projected/93689363-9408-4bc9-b502-0471871ff5ba-kube-api-access-2s5wm\") pod \"nova-cell0-cell-mapping-8fjc9\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:33 crc kubenswrapper[4922]: I0929 10:03:33.987174 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.042122 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.044410 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.045643 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk9p7\" (UniqueName: \"kubernetes.io/projected/6d2eb931-8920-4306-81db-b71e9162dbb3-kube-api-access-zk9p7\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.045734 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.045783 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.047949 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.067249 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.069108 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.144238 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.146487 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.148387 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk9p7\" (UniqueName: \"kubernetes.io/projected/6d2eb931-8920-4306-81db-b71e9162dbb3-kube-api-access-zk9p7\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.148452 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.148494 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.148523 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f0093a7-6fc6-4d3f-a415-2949e7df308d-logs\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.148568 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thlcd\" (UniqueName: \"kubernetes.io/projected/1f0093a7-6fc6-4d3f-a415-2949e7df308d-kube-api-access-thlcd\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.148592 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-config-data\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.148611 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.153951 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.165615 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.174588 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 
10:03:34.204013 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk9p7\" (UniqueName: \"kubernetes.io/projected/6d2eb931-8920-4306-81db-b71e9162dbb3-kube-api-access-zk9p7\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.215232 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.241291 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.250500 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f0093a7-6fc6-4d3f-a415-2949e7df308d-logs\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.250589 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.250628 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thlcd\" (UniqueName: \"kubernetes.io/projected/1f0093a7-6fc6-4d3f-a415-2949e7df308d-kube-api-access-thlcd\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.250655 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-config-data\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.250680 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.250723 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9jw5\" (UniqueName: \"kubernetes.io/projected/eb8a0865-0468-4195-a132-ba1fbc0b48a9-kube-api-access-m9jw5\") pod \"nova-scheduler-0\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.250823 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-config-data\") pod \"nova-scheduler-0\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.251066 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f0093a7-6fc6-4d3f-a415-2949e7df308d-logs\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.257276 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-config-data\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc 
kubenswrapper[4922]: I0929 10:03:34.257974 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.284598 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thlcd\" (UniqueName: \"kubernetes.io/projected/1f0093a7-6fc6-4d3f-a415-2949e7df308d-kube-api-access-thlcd\") pod \"nova-api-0\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.354024 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9jw5\" (UniqueName: \"kubernetes.io/projected/eb8a0865-0468-4195-a132-ba1fbc0b48a9-kube-api-access-m9jw5\") pod \"nova-scheduler-0\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.354165 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-config-data\") pod \"nova-scheduler-0\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.354276 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.360081 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 
10:03:34.395409 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-config-data\") pod \"nova-scheduler-0\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.396850 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.398368 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.402572 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9jw5\" (UniqueName: \"kubernetes.io/projected/eb8a0865-0468-4195-a132-ba1fbc0b48a9-kube-api-access-m9jw5\") pod \"nova-scheduler-0\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.406558 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.426082 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.450995 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-s9zj6"] Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.460189 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.472729 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-s9zj6"] Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.560030 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.563417 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.563477 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzbnf\" (UniqueName: \"kubernetes.io/projected/5cba84e2-e9c9-461b-86a9-57199d75496a-kube-api-access-mzbnf\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.563534 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czxgk\" (UniqueName: \"kubernetes.io/projected/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-kube-api-access-czxgk\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.563567 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " 
pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.563597 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-config-data\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.563778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cba84e2-e9c9-461b-86a9-57199d75496a-logs\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.563807 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.563847 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.563916 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc 
kubenswrapper[4922]: I0929 10:03:34.563951 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-config\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.573224 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.667218 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.667959 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-config\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.668041 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.668070 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzbnf\" (UniqueName: \"kubernetes.io/projected/5cba84e2-e9c9-461b-86a9-57199d75496a-kube-api-access-mzbnf\") pod \"nova-metadata-0\" 
(UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.668101 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czxgk\" (UniqueName: \"kubernetes.io/projected/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-kube-api-access-czxgk\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.668131 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.668159 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-config-data\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.668412 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cba84e2-e9c9-461b-86a9-57199d75496a-logs\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.668445 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 
10:03:34.668470 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.669794 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-config\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.668407 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.670451 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cba84e2-e9c9-461b-86a9-57199d75496a-logs\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.670809 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.670991 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.671409 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.674194 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.677665 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-config-data\") pod \"nova-metadata-0\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.688460 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czxgk\" (UniqueName: \"kubernetes.io/projected/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-kube-api-access-czxgk\") pod \"dnsmasq-dns-845d6d6f59-s9zj6\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.691108 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzbnf\" (UniqueName: \"kubernetes.io/projected/5cba84e2-e9c9-461b-86a9-57199d75496a-kube-api-access-mzbnf\") pod \"nova-metadata-0\" (UID: 
\"5cba84e2-e9c9-461b-86a9-57199d75496a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.745181 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.814589 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.820736 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8fjc9"] Sep 29 10:03:34 crc kubenswrapper[4922]: W0929 10:03:34.861375 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93689363_9408_4bc9_b502_0471871ff5ba.slice/crio-f1b25e6e8e802439c0522bed23290ae2687ec860a05866a1a70aa4a85aa56f0d WatchSource:0}: Error finding container f1b25e6e8e802439c0522bed23290ae2687ec860a05866a1a70aa4a85aa56f0d: Status 404 returned error can't find the container with id f1b25e6e8e802439c0522bed23290ae2687ec860a05866a1a70aa4a85aa56f0d Sep 29 10:03:34 crc kubenswrapper[4922]: I0929 10:03:34.950497 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.065121 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85szq"] Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.067867 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.077702 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.077729 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.080574 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85szq"] Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.182875 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.182955 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-scripts\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.183082 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-config-data\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.183113 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tjqmj\" (UniqueName: \"kubernetes.io/projected/7807d04e-1d92-4727-9cad-6504967c92ad-kube-api-access-tjqmj\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.232411 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.288068 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.288662 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-scripts\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.288854 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-config-data\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.288909 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjqmj\" (UniqueName: \"kubernetes.io/projected/7807d04e-1d92-4727-9cad-6504967c92ad-kube-api-access-tjqmj\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " 
pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.299805 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-config-data\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.307064 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.308309 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-scripts\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.313344 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjqmj\" (UniqueName: \"kubernetes.io/projected/7807d04e-1d92-4727-9cad-6504967c92ad-kube-api-access-tjqmj\") pod \"nova-cell1-conductor-db-sync-85szq\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.404494 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.422527 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.554422 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-s9zj6"] Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.659152 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.748754 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f0093a7-6fc6-4d3f-a415-2949e7df308d","Type":"ContainerStarted","Data":"ff01ac51476f074bbfae4477fd60427cd685f8136e8375ef5f99136f16a2b11f"} Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.751860 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8fjc9" event={"ID":"93689363-9408-4bc9-b502-0471871ff5ba","Type":"ContainerStarted","Data":"80d6d14c1fefb18655c259c0a92b49b4159e49ad90f7ce983642396bea4a7655"} Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.751951 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8fjc9" event={"ID":"93689363-9408-4bc9-b502-0471871ff5ba","Type":"ContainerStarted","Data":"f1b25e6e8e802439c0522bed23290ae2687ec860a05866a1a70aa4a85aa56f0d"} Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.761416 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5cba84e2-e9c9-461b-86a9-57199d75496a","Type":"ContainerStarted","Data":"00cadae9eb254af0ea30e1d7698369242e5b7694c9dcaca0ba8c8373b2d8d944"} Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.764213 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" event={"ID":"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9","Type":"ContainerStarted","Data":"34d5a66ecd4a5252e847a91eab082ab696a0d5bc2669332b4e720df0df67ce3b"} Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.765695 
4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb8a0865-0468-4195-a132-ba1fbc0b48a9","Type":"ContainerStarted","Data":"91ac45de35d8065be0523cff1da4ae4682e837464126597b06220006857c5de6"} Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.768198 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d2eb931-8920-4306-81db-b71e9162dbb3","Type":"ContainerStarted","Data":"7354c52640e0cdfd6ba1053fd6029783cb98df77337a32069af175116fd7c405"} Sep 29 10:03:35 crc kubenswrapper[4922]: I0929 10:03:35.782502 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-8fjc9" podStartSLOduration=2.782467157 podStartE2EDuration="2.782467157s" podCreationTimestamp="2025-09-29 10:03:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:35.771213358 +0000 UTC m=+1141.137443612" watchObservedRunningTime="2025-09-29 10:03:35.782467157 +0000 UTC m=+1141.148697421" Sep 29 10:03:36 crc kubenswrapper[4922]: I0929 10:03:36.087102 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85szq"] Sep 29 10:03:36 crc kubenswrapper[4922]: I0929 10:03:36.785628 4922 generic.go:334] "Generic (PLEG): container finished" podID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" containerID="b2404176b3c9df44dc034538dbe0f5b56947169132271ac6662b419cf382050b" exitCode=0 Sep 29 10:03:36 crc kubenswrapper[4922]: I0929 10:03:36.785721 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" event={"ID":"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9","Type":"ContainerDied","Data":"b2404176b3c9df44dc034538dbe0f5b56947169132271ac6662b419cf382050b"} Sep 29 10:03:36 crc kubenswrapper[4922]: I0929 10:03:36.788948 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-85szq" event={"ID":"7807d04e-1d92-4727-9cad-6504967c92ad","Type":"ContainerStarted","Data":"b7be3d7d3a50aabc809cfb0b5f4b2241e494282d6fefd0ef2e6316ce4e222470"} Sep 29 10:03:36 crc kubenswrapper[4922]: I0929 10:03:36.789024 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-85szq" event={"ID":"7807d04e-1d92-4727-9cad-6504967c92ad","Type":"ContainerStarted","Data":"e53080930c6bbad30e71d5d09cd7dbb6b96b3145c52c3bdf32a8b422c48395c5"} Sep 29 10:03:36 crc kubenswrapper[4922]: I0929 10:03:36.846820 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-85szq" podStartSLOduration=1.846793361 podStartE2EDuration="1.846793361s" podCreationTimestamp="2025-09-29 10:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:36.836030236 +0000 UTC m=+1142.202260510" watchObservedRunningTime="2025-09-29 10:03:36.846793361 +0000 UTC m=+1142.213023615" Sep 29 10:03:37 crc kubenswrapper[4922]: I0929 10:03:37.818475 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:37 crc kubenswrapper[4922]: I0929 10:03:37.841503 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.916874 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb8a0865-0468-4195-a132-ba1fbc0b48a9","Type":"ContainerStarted","Data":"0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8"} Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.921568 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"6d2eb931-8920-4306-81db-b71e9162dbb3","Type":"ContainerStarted","Data":"a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d"} Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.921672 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6d2eb931-8920-4306-81db-b71e9162dbb3" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d" gracePeriod=30 Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.925078 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f0093a7-6fc6-4d3f-a415-2949e7df308d","Type":"ContainerStarted","Data":"f9718ee8b1877eac30dadb29e70510e4ee5ef946cbad4750a51da533083af25d"} Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.925132 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f0093a7-6fc6-4d3f-a415-2949e7df308d","Type":"ContainerStarted","Data":"6803a3b1183e897ee335a27051dfdedc2609e621ecc98e5c4b3bafb80715e374"} Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.930273 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5cba84e2-e9c9-461b-86a9-57199d75496a","Type":"ContainerStarted","Data":"5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1"} Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.930317 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5cba84e2-e9c9-461b-86a9-57199d75496a","Type":"ContainerStarted","Data":"061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286"} Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.930527 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5cba84e2-e9c9-461b-86a9-57199d75496a" containerName="nova-metadata-metadata" 
containerID="cri-o://5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1" gracePeriod=30 Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.930503 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5cba84e2-e9c9-461b-86a9-57199d75496a" containerName="nova-metadata-log" containerID="cri-o://061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286" gracePeriod=30 Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.945752 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.54752585 podStartE2EDuration="5.94572717s" podCreationTimestamp="2025-09-29 10:03:34 +0000 UTC" firstStartedPulling="2025-09-29 10:03:35.4249663 +0000 UTC m=+1140.791196564" lastFinishedPulling="2025-09-29 10:03:38.82316762 +0000 UTC m=+1144.189397884" observedRunningTime="2025-09-29 10:03:39.936443894 +0000 UTC m=+1145.302674158" watchObservedRunningTime="2025-09-29 10:03:39.94572717 +0000 UTC m=+1145.311957434" Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.951661 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" event={"ID":"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9","Type":"ContainerStarted","Data":"e14fdb608d43bf1cb8d93b4c4b614ebc31df24a27e99b910ec49ce0d1252a32f"} Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.952238 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.960144 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.3912114620000002 podStartE2EDuration="6.960120972s" podCreationTimestamp="2025-09-29 10:03:33 +0000 UTC" firstStartedPulling="2025-09-29 10:03:35.253643203 +0000 UTC m=+1140.619873467" lastFinishedPulling="2025-09-29 10:03:38.822552713 +0000 UTC 
m=+1144.188782977" observedRunningTime="2025-09-29 10:03:39.959966818 +0000 UTC m=+1145.326197082" watchObservedRunningTime="2025-09-29 10:03:39.960120972 +0000 UTC m=+1145.326351236" Sep 29 10:03:39 crc kubenswrapper[4922]: I0929 10:03:39.976783 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.113245004 podStartE2EDuration="6.976751274s" podCreationTimestamp="2025-09-29 10:03:33 +0000 UTC" firstStartedPulling="2025-09-29 10:03:34.957082492 +0000 UTC m=+1140.323312756" lastFinishedPulling="2025-09-29 10:03:38.820588762 +0000 UTC m=+1144.186819026" observedRunningTime="2025-09-29 10:03:39.975702816 +0000 UTC m=+1145.341933080" watchObservedRunningTime="2025-09-29 10:03:39.976751274 +0000 UTC m=+1145.342981538" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.006440 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.818300266 podStartE2EDuration="6.006418981s" podCreationTimestamp="2025-09-29 10:03:34 +0000 UTC" firstStartedPulling="2025-09-29 10:03:35.65106534 +0000 UTC m=+1141.017295604" lastFinishedPulling="2025-09-29 10:03:38.839184055 +0000 UTC m=+1144.205414319" observedRunningTime="2025-09-29 10:03:39.994916515 +0000 UTC m=+1145.361146789" watchObservedRunningTime="2025-09-29 10:03:40.006418981 +0000 UTC m=+1145.372649245" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.027481 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" podStartSLOduration=6.027450399 podStartE2EDuration="6.027450399s" podCreationTimestamp="2025-09-29 10:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:40.017587607 +0000 UTC m=+1145.383817871" watchObservedRunningTime="2025-09-29 10:03:40.027450399 +0000 UTC m=+1145.393680663" Sep 29 
10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.565671 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.670084 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzbnf\" (UniqueName: \"kubernetes.io/projected/5cba84e2-e9c9-461b-86a9-57199d75496a-kube-api-access-mzbnf\") pod \"5cba84e2-e9c9-461b-86a9-57199d75496a\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.670286 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-combined-ca-bundle\") pod \"5cba84e2-e9c9-461b-86a9-57199d75496a\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.672147 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-config-data\") pod \"5cba84e2-e9c9-461b-86a9-57199d75496a\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.672192 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cba84e2-e9c9-461b-86a9-57199d75496a-logs\") pod \"5cba84e2-e9c9-461b-86a9-57199d75496a\" (UID: \"5cba84e2-e9c9-461b-86a9-57199d75496a\") " Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.672798 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cba84e2-e9c9-461b-86a9-57199d75496a-logs" (OuterVolumeSpecName: "logs") pod "5cba84e2-e9c9-461b-86a9-57199d75496a" (UID: "5cba84e2-e9c9-461b-86a9-57199d75496a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.673456 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cba84e2-e9c9-461b-86a9-57199d75496a-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.677933 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cba84e2-e9c9-461b-86a9-57199d75496a-kube-api-access-mzbnf" (OuterVolumeSpecName: "kube-api-access-mzbnf") pod "5cba84e2-e9c9-461b-86a9-57199d75496a" (UID: "5cba84e2-e9c9-461b-86a9-57199d75496a"). InnerVolumeSpecName "kube-api-access-mzbnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.712252 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-config-data" (OuterVolumeSpecName: "config-data") pod "5cba84e2-e9c9-461b-86a9-57199d75496a" (UID: "5cba84e2-e9c9-461b-86a9-57199d75496a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.712861 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cba84e2-e9c9-461b-86a9-57199d75496a" (UID: "5cba84e2-e9c9-461b-86a9-57199d75496a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.775811 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.775880 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzbnf\" (UniqueName: \"kubernetes.io/projected/5cba84e2-e9c9-461b-86a9-57199d75496a-kube-api-access-mzbnf\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.775896 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cba84e2-e9c9-461b-86a9-57199d75496a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.973783 4922 generic.go:334] "Generic (PLEG): container finished" podID="5cba84e2-e9c9-461b-86a9-57199d75496a" containerID="5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1" exitCode=0 Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.974221 4922 generic.go:334] "Generic (PLEG): container finished" podID="5cba84e2-e9c9-461b-86a9-57199d75496a" containerID="061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286" exitCode=143 Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.974005 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.973877 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5cba84e2-e9c9-461b-86a9-57199d75496a","Type":"ContainerDied","Data":"5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1"} Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.974361 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5cba84e2-e9c9-461b-86a9-57199d75496a","Type":"ContainerDied","Data":"061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286"} Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.974394 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5cba84e2-e9c9-461b-86a9-57199d75496a","Type":"ContainerDied","Data":"00cadae9eb254af0ea30e1d7698369242e5b7694c9dcaca0ba8c8373b2d8d944"} Sep 29 10:03:40 crc kubenswrapper[4922]: I0929 10:03:40.974429 4922 scope.go:117] "RemoveContainer" containerID="5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.009806 4922 scope.go:117] "RemoveContainer" containerID="061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.037671 4922 scope.go:117] "RemoveContainer" containerID="5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1" Sep 29 10:03:41 crc kubenswrapper[4922]: E0929 10:03:41.042179 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1\": container with ID starting with 5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1 not found: ID does not exist" containerID="5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1" Sep 29 10:03:41 crc kubenswrapper[4922]: 
I0929 10:03:41.042253 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1"} err="failed to get container status \"5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1\": rpc error: code = NotFound desc = could not find container \"5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1\": container with ID starting with 5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1 not found: ID does not exist" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.042296 4922 scope.go:117] "RemoveContainer" containerID="061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286" Sep 29 10:03:41 crc kubenswrapper[4922]: E0929 10:03:41.042795 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286\": container with ID starting with 061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286 not found: ID does not exist" containerID="061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.042846 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286"} err="failed to get container status \"061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286\": rpc error: code = NotFound desc = could not find container \"061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286\": container with ID starting with 061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286 not found: ID does not exist" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.042881 4922 scope.go:117] "RemoveContainer" containerID="5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1" Sep 29 10:03:41 crc 
kubenswrapper[4922]: I0929 10:03:41.044233 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.047804 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1"} err="failed to get container status \"5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1\": rpc error: code = NotFound desc = could not find container \"5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1\": container with ID starting with 5370518d5318e7c9b5b20a1d3e374b84a33b4fc73ca689d39c0cea2c7d319bd1 not found: ID does not exist" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.047886 4922 scope.go:117] "RemoveContainer" containerID="061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.048921 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286"} err="failed to get container status \"061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286\": rpc error: code = NotFound desc = could not find container \"061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286\": container with ID starting with 061e809212496540bec17c5f2e153b635a58710b71bb8a7cfea9591509b17286 not found: ID does not exist" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.059729 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.073618 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:41 crc kubenswrapper[4922]: E0929 10:03:41.074473 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cba84e2-e9c9-461b-86a9-57199d75496a" 
containerName="nova-metadata-log" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.074498 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cba84e2-e9c9-461b-86a9-57199d75496a" containerName="nova-metadata-log" Sep 29 10:03:41 crc kubenswrapper[4922]: E0929 10:03:41.074517 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cba84e2-e9c9-461b-86a9-57199d75496a" containerName="nova-metadata-metadata" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.074525 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cba84e2-e9c9-461b-86a9-57199d75496a" containerName="nova-metadata-metadata" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.074932 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cba84e2-e9c9-461b-86a9-57199d75496a" containerName="nova-metadata-metadata" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.074966 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cba84e2-e9c9-461b-86a9-57199d75496a" containerName="nova-metadata-log" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.076355 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.081589 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.082657 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.082819 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.186192 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.186241 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.186268 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-config-data\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.186325 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qpz\" (UniqueName: 
\"kubernetes.io/projected/86c2b134-418e-4bca-97bd-c3c793ea349a-kube-api-access-75qpz\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.186351 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c2b134-418e-4bca-97bd-c3c793ea349a-logs\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.288280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qpz\" (UniqueName: \"kubernetes.io/projected/86c2b134-418e-4bca-97bd-c3c793ea349a-kube-api-access-75qpz\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.288383 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c2b134-418e-4bca-97bd-c3c793ea349a-logs\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.288583 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.288695 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " 
pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.288787 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-config-data\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.289020 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c2b134-418e-4bca-97bd-c3c793ea349a-logs\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.298641 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.298702 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-config-data\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.304779 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.307722 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qpz\" (UniqueName: 
\"kubernetes.io/projected/86c2b134-418e-4bca-97bd-c3c793ea349a-kube-api-access-75qpz\") pod \"nova-metadata-0\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.401460 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.469742 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cba84e2-e9c9-461b-86a9-57199d75496a" path="/var/lib/kubelet/pods/5cba84e2-e9c9-461b-86a9-57199d75496a/volumes" Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.891544 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:41 crc kubenswrapper[4922]: I0929 10:03:41.992884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86c2b134-418e-4bca-97bd-c3c793ea349a","Type":"ContainerStarted","Data":"c52d00a0a2dbfe283548a143702d9d7142ac80747dfdb730a3c02dc357aff49a"} Sep 29 10:03:43 crc kubenswrapper[4922]: I0929 10:03:43.014005 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86c2b134-418e-4bca-97bd-c3c793ea349a","Type":"ContainerStarted","Data":"bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89"} Sep 29 10:03:43 crc kubenswrapper[4922]: I0929 10:03:43.014354 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86c2b134-418e-4bca-97bd-c3c793ea349a","Type":"ContainerStarted","Data":"00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836"} Sep 29 10:03:43 crc kubenswrapper[4922]: I0929 10:03:43.050356 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.050317799 podStartE2EDuration="2.050317799s" podCreationTimestamp="2025-09-29 10:03:41 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:43.040925729 +0000 UTC m=+1148.407156013" watchObservedRunningTime="2025-09-29 10:03:43.050317799 +0000 UTC m=+1148.416548083" Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.030202 4922 generic.go:334] "Generic (PLEG): container finished" podID="93689363-9408-4bc9-b502-0471871ff5ba" containerID="80d6d14c1fefb18655c259c0a92b49b4159e49ad90f7ce983642396bea4a7655" exitCode=0 Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.030969 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8fjc9" event={"ID":"93689363-9408-4bc9-b502-0471871ff5ba","Type":"ContainerDied","Data":"80d6d14c1fefb18655c259c0a92b49b4159e49ad90f7ce983642396bea4a7655"} Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.242791 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.562235 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.563077 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.574578 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.574651 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.626366 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.818167 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" 
Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.929119 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-4k9lh"] Sep 29 10:03:44 crc kubenswrapper[4922]: I0929 10:03:44.929445 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" podUID="bfa5df04-6c6c-4b1b-868c-47daf84b7da2" containerName="dnsmasq-dns" containerID="cri-o://f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742" gracePeriod=10 Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.127270 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.586726 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.595379 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.626802 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-scripts\") pod \"93689363-9408-4bc9-b502-0471871ff5ba\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.626883 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s5wm\" (UniqueName: \"kubernetes.io/projected/93689363-9408-4bc9-b502-0471871ff5ba-kube-api-access-2s5wm\") pod \"93689363-9408-4bc9-b502-0471871ff5ba\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.626915 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-combined-ca-bundle\") pod \"93689363-9408-4bc9-b502-0471871ff5ba\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.626943 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-config-data\") pod \"93689363-9408-4bc9-b502-0471871ff5ba\" (UID: \"93689363-9408-4bc9-b502-0471871ff5ba\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.626972 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-swift-storage-0\") pod \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.627011 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rsmm6\" (UniqueName: \"kubernetes.io/projected/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-kube-api-access-rsmm6\") pod \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.627031 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-sb\") pod \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.627130 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-config\") pod \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.627163 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-svc\") pod \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.627223 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-nb\") pod \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\" (UID: \"bfa5df04-6c6c-4b1b-868c-47daf84b7da2\") " Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.647540 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:03:45 crc 
kubenswrapper[4922]: I0929 10:03:45.648745 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.649982 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93689363-9408-4bc9-b502-0471871ff5ba-kube-api-access-2s5wm" (OuterVolumeSpecName: "kube-api-access-2s5wm") pod "93689363-9408-4bc9-b502-0471871ff5ba" (UID: "93689363-9408-4bc9-b502-0471871ff5ba"). InnerVolumeSpecName "kube-api-access-2s5wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.683062 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-kube-api-access-rsmm6" (OuterVolumeSpecName: "kube-api-access-rsmm6") pod "bfa5df04-6c6c-4b1b-868c-47daf84b7da2" (UID: "bfa5df04-6c6c-4b1b-868c-47daf84b7da2"). InnerVolumeSpecName "kube-api-access-rsmm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.689147 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-scripts" (OuterVolumeSpecName: "scripts") pod "93689363-9408-4bc9-b502-0471871ff5ba" (UID: "93689363-9408-4bc9-b502-0471871ff5ba"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.691098 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bfa5df04-6c6c-4b1b-868c-47daf84b7da2" (UID: "bfa5df04-6c6c-4b1b-868c-47daf84b7da2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.719766 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-config-data" (OuterVolumeSpecName: "config-data") pod "93689363-9408-4bc9-b502-0471871ff5ba" (UID: "93689363-9408-4bc9-b502-0471871ff5ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.728819 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.728943 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s5wm\" (UniqueName: \"kubernetes.io/projected/93689363-9408-4bc9-b502-0471871ff5ba-kube-api-access-2s5wm\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.728959 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.728971 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.728983 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsmm6\" (UniqueName: \"kubernetes.io/projected/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-kube-api-access-rsmm6\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.745203 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93689363-9408-4bc9-b502-0471871ff5ba" (UID: "93689363-9408-4bc9-b502-0471871ff5ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.745570 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfa5df04-6c6c-4b1b-868c-47daf84b7da2" (UID: "bfa5df04-6c6c-4b1b-868c-47daf84b7da2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.745712 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-config" (OuterVolumeSpecName: "config") pod "bfa5df04-6c6c-4b1b-868c-47daf84b7da2" (UID: "bfa5df04-6c6c-4b1b-868c-47daf84b7da2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.754886 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bfa5df04-6c6c-4b1b-868c-47daf84b7da2" (UID: "bfa5df04-6c6c-4b1b-868c-47daf84b7da2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.782250 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bfa5df04-6c6c-4b1b-868c-47daf84b7da2" (UID: "bfa5df04-6c6c-4b1b-868c-47daf84b7da2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.834998 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.835040 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.835052 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.835060 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa5df04-6c6c-4b1b-868c-47daf84b7da2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:45 crc kubenswrapper[4922]: I0929 10:03:45.835072 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93689363-9408-4bc9-b502-0471871ff5ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.059039 4922 generic.go:334] "Generic (PLEG): container finished" podID="7807d04e-1d92-4727-9cad-6504967c92ad" 
containerID="b7be3d7d3a50aabc809cfb0b5f4b2241e494282d6fefd0ef2e6316ce4e222470" exitCode=0 Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.059165 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-85szq" event={"ID":"7807d04e-1d92-4727-9cad-6504967c92ad","Type":"ContainerDied","Data":"b7be3d7d3a50aabc809cfb0b5f4b2241e494282d6fefd0ef2e6316ce4e222470"} Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.063201 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8fjc9" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.063188 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8fjc9" event={"ID":"93689363-9408-4bc9-b502-0471871ff5ba","Type":"ContainerDied","Data":"f1b25e6e8e802439c0522bed23290ae2687ec860a05866a1a70aa4a85aa56f0d"} Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.063363 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b25e6e8e802439c0522bed23290ae2687ec860a05866a1a70aa4a85aa56f0d" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.066215 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" event={"ID":"bfa5df04-6c6c-4b1b-868c-47daf84b7da2","Type":"ContainerDied","Data":"f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742"} Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.066225 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.066175 4922 generic.go:334] "Generic (PLEG): container finished" podID="bfa5df04-6c6c-4b1b-868c-47daf84b7da2" containerID="f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742" exitCode=0 Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.066383 4922 scope.go:117] "RemoveContainer" containerID="f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.066699 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-4k9lh" event={"ID":"bfa5df04-6c6c-4b1b-868c-47daf84b7da2","Type":"ContainerDied","Data":"b5be229f38dfb21368713c09261beafbb1c688cafcc23d61523f54f659609ae2"} Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.176038 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-4k9lh"] Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.187844 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-4k9lh"] Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.190936 4922 scope.go:117] "RemoveContainer" containerID="f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.220894 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.221288 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-log" containerID="cri-o://6803a3b1183e897ee335a27051dfdedc2609e621ecc98e5c4b3bafb80715e374" gracePeriod=30 Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.221733 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-api" containerID="cri-o://f9718ee8b1877eac30dadb29e70510e4ee5ef946cbad4750a51da533083af25d" gracePeriod=30 Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.230298 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.230622 4922 scope.go:117] "RemoveContainer" containerID="f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742" Sep 29 10:03:46 crc kubenswrapper[4922]: E0929 10:03:46.240357 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742\": container with ID starting with f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742 not found: ID does not exist" containerID="f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.240425 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742"} err="failed to get container status \"f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742\": rpc error: code = NotFound desc = could not find container \"f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742\": container with ID starting with f6f870b9ae153853a1da8d45c1f2db9761d01c915cf45ed4e004e1c3dc095742 not found: ID does not exist" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.240459 4922 scope.go:117] "RemoveContainer" containerID="f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.243095 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.243370 4922 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerName="nova-metadata-log" containerID="cri-o://00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836" gracePeriod=30 Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.243874 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerName="nova-metadata-metadata" containerID="cri-o://bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89" gracePeriod=30 Sep 29 10:03:46 crc kubenswrapper[4922]: E0929 10:03:46.245205 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db\": container with ID starting with f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db not found: ID does not exist" containerID="f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.245294 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db"} err="failed to get container status \"f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db\": rpc error: code = NotFound desc = could not find container \"f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db\": container with ID starting with f7f2a29474227855bea5e6ed6fdc790a9a1ff9fee6c33fe59d4dcd5535b142db not found: ID does not exist" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.402121 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.402197 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" 
Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.770141 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.965608 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75qpz\" (UniqueName: \"kubernetes.io/projected/86c2b134-418e-4bca-97bd-c3c793ea349a-kube-api-access-75qpz\") pod \"86c2b134-418e-4bca-97bd-c3c793ea349a\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.966699 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c2b134-418e-4bca-97bd-c3c793ea349a-logs\") pod \"86c2b134-418e-4bca-97bd-c3c793ea349a\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.966795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-config-data\") pod \"86c2b134-418e-4bca-97bd-c3c793ea349a\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.967063 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-nova-metadata-tls-certs\") pod \"86c2b134-418e-4bca-97bd-c3c793ea349a\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.967198 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c2b134-418e-4bca-97bd-c3c793ea349a-logs" (OuterVolumeSpecName: "logs") pod "86c2b134-418e-4bca-97bd-c3c793ea349a" (UID: "86c2b134-418e-4bca-97bd-c3c793ea349a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.967570 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-combined-ca-bundle\") pod \"86c2b134-418e-4bca-97bd-c3c793ea349a\" (UID: \"86c2b134-418e-4bca-97bd-c3c793ea349a\") " Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.968292 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c2b134-418e-4bca-97bd-c3c793ea349a-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:46 crc kubenswrapper[4922]: I0929 10:03:46.972667 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c2b134-418e-4bca-97bd-c3c793ea349a-kube-api-access-75qpz" (OuterVolumeSpecName: "kube-api-access-75qpz") pod "86c2b134-418e-4bca-97bd-c3c793ea349a" (UID: "86c2b134-418e-4bca-97bd-c3c793ea349a"). InnerVolumeSpecName "kube-api-access-75qpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.013022 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-config-data" (OuterVolumeSpecName: "config-data") pod "86c2b134-418e-4bca-97bd-c3c793ea349a" (UID: "86c2b134-418e-4bca-97bd-c3c793ea349a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.022113 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86c2b134-418e-4bca-97bd-c3c793ea349a" (UID: "86c2b134-418e-4bca-97bd-c3c793ea349a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.049450 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "86c2b134-418e-4bca-97bd-c3c793ea349a" (UID: "86c2b134-418e-4bca-97bd-c3c793ea349a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.070255 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75qpz\" (UniqueName: \"kubernetes.io/projected/86c2b134-418e-4bca-97bd-c3c793ea349a-kube-api-access-75qpz\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.070298 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.070308 4922 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.070320 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c2b134-418e-4bca-97bd-c3c793ea349a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.096273 4922 generic.go:334] "Generic (PLEG): container finished" podID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerID="6803a3b1183e897ee335a27051dfdedc2609e621ecc98e5c4b3bafb80715e374" exitCode=143 Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.096390 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"1f0093a7-6fc6-4d3f-a415-2949e7df308d","Type":"ContainerDied","Data":"6803a3b1183e897ee335a27051dfdedc2609e621ecc98e5c4b3bafb80715e374"} Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.099172 4922 generic.go:334] "Generic (PLEG): container finished" podID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerID="bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89" exitCode=0 Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.099233 4922 generic.go:334] "Generic (PLEG): container finished" podID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerID="00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836" exitCode=143 Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.099363 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86c2b134-418e-4bca-97bd-c3c793ea349a","Type":"ContainerDied","Data":"bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89"} Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.099414 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86c2b134-418e-4bca-97bd-c3c793ea349a","Type":"ContainerDied","Data":"00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836"} Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.099435 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"86c2b134-418e-4bca-97bd-c3c793ea349a","Type":"ContainerDied","Data":"c52d00a0a2dbfe283548a143702d9d7142ac80747dfdb730a3c02dc357aff49a"} Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.099467 4922 scope.go:117] "RemoveContainer" containerID="bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.099642 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.103287 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="eb8a0865-0468-4195-a132-ba1fbc0b48a9" containerName="nova-scheduler-scheduler" containerID="cri-o://0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8" gracePeriod=30 Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.128375 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.184481 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.194976 4922 scope.go:117] "RemoveContainer" containerID="00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.216991 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.228526 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:47 crc kubenswrapper[4922]: E0929 10:03:47.229258 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerName="nova-metadata-log" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.229288 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerName="nova-metadata-log" Sep 29 10:03:47 crc kubenswrapper[4922]: E0929 10:03:47.229302 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93689363-9408-4bc9-b502-0471871ff5ba" containerName="nova-manage" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.229312 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="93689363-9408-4bc9-b502-0471871ff5ba" 
containerName="nova-manage" Sep 29 10:03:47 crc kubenswrapper[4922]: E0929 10:03:47.229564 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerName="nova-metadata-metadata" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.229583 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerName="nova-metadata-metadata" Sep 29 10:03:47 crc kubenswrapper[4922]: E0929 10:03:47.229607 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa5df04-6c6c-4b1b-868c-47daf84b7da2" containerName="dnsmasq-dns" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.229615 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa5df04-6c6c-4b1b-868c-47daf84b7da2" containerName="dnsmasq-dns" Sep 29 10:03:47 crc kubenswrapper[4922]: E0929 10:03:47.229661 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa5df04-6c6c-4b1b-868c-47daf84b7da2" containerName="init" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.229667 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa5df04-6c6c-4b1b-868c-47daf84b7da2" containerName="init" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.229917 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerName="nova-metadata-metadata" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.229935 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2b134-418e-4bca-97bd-c3c793ea349a" containerName="nova-metadata-log" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.229945 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa5df04-6c6c-4b1b-868c-47daf84b7da2" containerName="dnsmasq-dns" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.229963 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="93689363-9408-4bc9-b502-0471871ff5ba" 
containerName="nova-manage" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.232392 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.235892 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.235901 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.239607 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.255900 4922 scope.go:117] "RemoveContainer" containerID="bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89" Sep 29 10:03:47 crc kubenswrapper[4922]: E0929 10:03:47.259046 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89\": container with ID starting with bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89 not found: ID does not exist" containerID="bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.259102 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89"} err="failed to get container status \"bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89\": rpc error: code = NotFound desc = could not find container \"bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89\": container with ID starting with bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89 not found: ID does not exist" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.259144 
4922 scope.go:117] "RemoveContainer" containerID="00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836" Sep 29 10:03:47 crc kubenswrapper[4922]: E0929 10:03:47.264006 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836\": container with ID starting with 00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836 not found: ID does not exist" containerID="00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.264090 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836"} err="failed to get container status \"00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836\": rpc error: code = NotFound desc = could not find container \"00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836\": container with ID starting with 00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836 not found: ID does not exist" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.264140 4922 scope.go:117] "RemoveContainer" containerID="bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.264554 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89"} err="failed to get container status \"bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89\": rpc error: code = NotFound desc = could not find container \"bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89\": container with ID starting with bb90695ca7195af3ec715e7d7889d41009161fc505c463c304731867455ccd89 not found: ID does not exist" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 
10:03:47.264636 4922 scope.go:117] "RemoveContainer" containerID="00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.272756 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836"} err="failed to get container status \"00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836\": rpc error: code = NotFound desc = could not find container \"00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836\": container with ID starting with 00f72910374de7fd763ed7db703593575860cc35f69dd255d8f288c77bfd5836 not found: ID does not exist" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.413632 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s5mk\" (UniqueName: \"kubernetes.io/projected/f1ff74e4-dbc7-42b5-9f8c-07812498f738-kube-api-access-5s5mk\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.413699 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-config-data\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.413784 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.413872 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1ff74e4-dbc7-42b5-9f8c-07812498f738-logs\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.413910 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.466393 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c2b134-418e-4bca-97bd-c3c793ea349a" path="/var/lib/kubelet/pods/86c2b134-418e-4bca-97bd-c3c793ea349a/volumes" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.467599 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfa5df04-6c6c-4b1b-868c-47daf84b7da2" path="/var/lib/kubelet/pods/bfa5df04-6c6c-4b1b-868c-47daf84b7da2/volumes" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.516096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s5mk\" (UniqueName: \"kubernetes.io/projected/f1ff74e4-dbc7-42b5-9f8c-07812498f738-kube-api-access-5s5mk\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.516195 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-config-data\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.516269 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.516377 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1ff74e4-dbc7-42b5-9f8c-07812498f738-logs\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.516436 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.518074 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1ff74e4-dbc7-42b5-9f8c-07812498f738-logs\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.522547 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.525445 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.526655 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-config-data\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.546665 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s5mk\" (UniqueName: \"kubernetes.io/projected/f1ff74e4-dbc7-42b5-9f8c-07812498f738-kube-api-access-5s5mk\") pod \"nova-metadata-0\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.558780 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.693052 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.821801 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-config-data\") pod \"7807d04e-1d92-4727-9cad-6504967c92ad\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.822631 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-combined-ca-bundle\") pod \"7807d04e-1d92-4727-9cad-6504967c92ad\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.822713 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-scripts\") pod \"7807d04e-1d92-4727-9cad-6504967c92ad\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.822849 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjqmj\" (UniqueName: \"kubernetes.io/projected/7807d04e-1d92-4727-9cad-6504967c92ad-kube-api-access-tjqmj\") pod \"7807d04e-1d92-4727-9cad-6504967c92ad\" (UID: \"7807d04e-1d92-4727-9cad-6504967c92ad\") " Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.830141 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-scripts" (OuterVolumeSpecName: "scripts") pod "7807d04e-1d92-4727-9cad-6504967c92ad" (UID: "7807d04e-1d92-4727-9cad-6504967c92ad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.830609 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7807d04e-1d92-4727-9cad-6504967c92ad-kube-api-access-tjqmj" (OuterVolumeSpecName: "kube-api-access-tjqmj") pod "7807d04e-1d92-4727-9cad-6504967c92ad" (UID: "7807d04e-1d92-4727-9cad-6504967c92ad"). InnerVolumeSpecName "kube-api-access-tjqmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.855791 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-config-data" (OuterVolumeSpecName: "config-data") pod "7807d04e-1d92-4727-9cad-6504967c92ad" (UID: "7807d04e-1d92-4727-9cad-6504967c92ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.879547 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7807d04e-1d92-4727-9cad-6504967c92ad" (UID: "7807d04e-1d92-4727-9cad-6504967c92ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.924957 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.925002 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.925017 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjqmj\" (UniqueName: \"kubernetes.io/projected/7807d04e-1d92-4727-9cad-6504967c92ad-kube-api-access-tjqmj\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:47 crc kubenswrapper[4922]: I0929 10:03:47.925032 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7807d04e-1d92-4727-9cad-6504967c92ad-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.081732 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.122212 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1ff74e4-dbc7-42b5-9f8c-07812498f738","Type":"ContainerStarted","Data":"6ec632710354d3f25d1faabbc6578636c73a81b18fed5592b0c6568e86ba56d6"} Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.124040 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-85szq" event={"ID":"7807d04e-1d92-4727-9cad-6504967c92ad","Type":"ContainerDied","Data":"e53080930c6bbad30e71d5d09cd7dbb6b96b3145c52c3bdf32a8b422c48395c5"} Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.124077 4922 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="e53080930c6bbad30e71d5d09cd7dbb6b96b3145c52c3bdf32a8b422c48395c5" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.124159 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-85szq" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.182765 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 10:03:48 crc kubenswrapper[4922]: E0929 10:03:48.183341 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7807d04e-1d92-4727-9cad-6504967c92ad" containerName="nova-cell1-conductor-db-sync" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.183362 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7807d04e-1d92-4727-9cad-6504967c92ad" containerName="nova-cell1-conductor-db-sync" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.183605 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7807d04e-1d92-4727-9cad-6504967c92ad" containerName="nova-cell1-conductor-db-sync" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.184614 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.187793 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.218056 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.336343 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfzxx\" (UniqueName: \"kubernetes.io/projected/4d6968da-188d-461e-ab0f-00bf3e2fab0c-kube-api-access-tfzxx\") pod \"nova-cell1-conductor-0\" (UID: \"4d6968da-188d-461e-ab0f-00bf3e2fab0c\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.336403 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6968da-188d-461e-ab0f-00bf3e2fab0c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4d6968da-188d-461e-ab0f-00bf3e2fab0c\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.336437 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6968da-188d-461e-ab0f-00bf3e2fab0c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4d6968da-188d-461e-ab0f-00bf3e2fab0c\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.438106 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfzxx\" (UniqueName: \"kubernetes.io/projected/4d6968da-188d-461e-ab0f-00bf3e2fab0c-kube-api-access-tfzxx\") pod \"nova-cell1-conductor-0\" (UID: \"4d6968da-188d-461e-ab0f-00bf3e2fab0c\") " pod="openstack/nova-cell1-conductor-0" Sep 29 
10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.438209 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6968da-188d-461e-ab0f-00bf3e2fab0c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4d6968da-188d-461e-ab0f-00bf3e2fab0c\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.438243 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6968da-188d-461e-ab0f-00bf3e2fab0c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4d6968da-188d-461e-ab0f-00bf3e2fab0c\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.445676 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6968da-188d-461e-ab0f-00bf3e2fab0c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4d6968da-188d-461e-ab0f-00bf3e2fab0c\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.445976 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6968da-188d-461e-ab0f-00bf3e2fab0c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4d6968da-188d-461e-ab0f-00bf3e2fab0c\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.459283 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfzxx\" (UniqueName: \"kubernetes.io/projected/4d6968da-188d-461e-ab0f-00bf3e2fab0c-kube-api-access-tfzxx\") pod \"nova-cell1-conductor-0\" (UID: \"4d6968da-188d-461e-ab0f-00bf3e2fab0c\") " pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:48 crc kubenswrapper[4922]: I0929 10:03:48.602892 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:49 crc kubenswrapper[4922]: I0929 10:03:49.091752 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 10:03:49 crc kubenswrapper[4922]: W0929 10:03:49.098727 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d6968da_188d_461e_ab0f_00bf3e2fab0c.slice/crio-c7ee22ecef83a1d9f1ee54aacd1f11139e0f15021dce1dec0ff1d7605ef1c2c5 WatchSource:0}: Error finding container c7ee22ecef83a1d9f1ee54aacd1f11139e0f15021dce1dec0ff1d7605ef1c2c5: Status 404 returned error can't find the container with id c7ee22ecef83a1d9f1ee54aacd1f11139e0f15021dce1dec0ff1d7605ef1c2c5 Sep 29 10:03:49 crc kubenswrapper[4922]: I0929 10:03:49.141351 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4d6968da-188d-461e-ab0f-00bf3e2fab0c","Type":"ContainerStarted","Data":"c7ee22ecef83a1d9f1ee54aacd1f11139e0f15021dce1dec0ff1d7605ef1c2c5"} Sep 29 10:03:49 crc kubenswrapper[4922]: I0929 10:03:49.144677 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1ff74e4-dbc7-42b5-9f8c-07812498f738","Type":"ContainerStarted","Data":"e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739"} Sep 29 10:03:49 crc kubenswrapper[4922]: I0929 10:03:49.144741 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1ff74e4-dbc7-42b5-9f8c-07812498f738","Type":"ContainerStarted","Data":"ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977"} Sep 29 10:03:49 crc kubenswrapper[4922]: I0929 10:03:49.178103 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.178067004 podStartE2EDuration="2.178067004s" podCreationTimestamp="2025-09-29 10:03:47 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:49.173093083 +0000 UTC m=+1154.539323527" watchObservedRunningTime="2025-09-29 10:03:49.178067004 +0000 UTC m=+1154.544297348" Sep 29 10:03:49 crc kubenswrapper[4922]: E0929 10:03:49.579310 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:03:49 crc kubenswrapper[4922]: E0929 10:03:49.582194 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:03:49 crc kubenswrapper[4922]: E0929 10:03:49.584506 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:03:49 crc kubenswrapper[4922]: E0929 10:03:49.584554 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="eb8a0865-0468-4195-a132-ba1fbc0b48a9" containerName="nova-scheduler-scheduler" Sep 29 10:03:50 crc kubenswrapper[4922]: I0929 10:03:50.156364 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"4d6968da-188d-461e-ab0f-00bf3e2fab0c","Type":"ContainerStarted","Data":"761b1e483ae4f3b289b56614111c06ac0e7539ed4031c155ff8c8f95e9026d65"} Sep 29 10:03:50 crc kubenswrapper[4922]: I0929 10:03:50.180452 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.180430185 podStartE2EDuration="2.180430185s" podCreationTimestamp="2025-09-29 10:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:50.173527072 +0000 UTC m=+1155.539757336" watchObservedRunningTime="2025-09-29 10:03:50.180430185 +0000 UTC m=+1155.546660439" Sep 29 10:03:51 crc kubenswrapper[4922]: I0929 10:03:51.170427 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:51 crc kubenswrapper[4922]: I0929 10:03:51.497988 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:03:51 crc kubenswrapper[4922]: I0929 10:03:51.498340 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="13dedc08-5eea-497d-a7f0-8509ea2000c0" containerName="kube-state-metrics" containerID="cri-o://b57f2910ae8de2ced077b6288e0736d8935e9d6277482b913d7cef55133f0655" gracePeriod=30 Sep 29 10:03:51 crc kubenswrapper[4922]: E0929 10:03:51.757272 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb8a0865_0468_4195_a132_ba1fbc0b48a9.slice/crio-0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb8a0865_0468_4195_a132_ba1fbc0b48a9.slice/crio-conmon-0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8.scope\": RecentStats: unable to find data in memory cache]" Sep 29 10:03:51 crc kubenswrapper[4922]: I0929 10:03:51.932481 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.060186 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-combined-ca-bundle\") pod \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.060383 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9jw5\" (UniqueName: \"kubernetes.io/projected/eb8a0865-0468-4195-a132-ba1fbc0b48a9-kube-api-access-m9jw5\") pod \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.060573 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-config-data\") pod \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\" (UID: \"eb8a0865-0468-4195-a132-ba1fbc0b48a9\") " Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.069426 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb8a0865-0468-4195-a132-ba1fbc0b48a9-kube-api-access-m9jw5" (OuterVolumeSpecName: "kube-api-access-m9jw5") pod "eb8a0865-0468-4195-a132-ba1fbc0b48a9" (UID: "eb8a0865-0468-4195-a132-ba1fbc0b48a9"). InnerVolumeSpecName "kube-api-access-m9jw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.114247 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb8a0865-0468-4195-a132-ba1fbc0b48a9" (UID: "eb8a0865-0468-4195-a132-ba1fbc0b48a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.131027 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-config-data" (OuterVolumeSpecName: "config-data") pod "eb8a0865-0468-4195-a132-ba1fbc0b48a9" (UID: "eb8a0865-0468-4195-a132-ba1fbc0b48a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.165603 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.165650 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9jw5\" (UniqueName: \"kubernetes.io/projected/eb8a0865-0468-4195-a132-ba1fbc0b48a9-kube-api-access-m9jw5\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.165662 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8a0865-0468-4195-a132-ba1fbc0b48a9-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.186253 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.187551 4922 generic.go:334] "Generic (PLEG): container finished" podID="13dedc08-5eea-497d-a7f0-8509ea2000c0" containerID="b57f2910ae8de2ced077b6288e0736d8935e9d6277482b913d7cef55133f0655" exitCode=2 Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.187660 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13dedc08-5eea-497d-a7f0-8509ea2000c0","Type":"ContainerDied","Data":"b57f2910ae8de2ced077b6288e0736d8935e9d6277482b913d7cef55133f0655"} Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.187725 4922 scope.go:117] "RemoveContainer" containerID="b57f2910ae8de2ced077b6288e0736d8935e9d6277482b913d7cef55133f0655" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.190762 4922 generic.go:334] "Generic (PLEG): container finished" podID="eb8a0865-0468-4195-a132-ba1fbc0b48a9" containerID="0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8" exitCode=0 Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.190827 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb8a0865-0468-4195-a132-ba1fbc0b48a9","Type":"ContainerDied","Data":"0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8"} Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.190914 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb8a0865-0468-4195-a132-ba1fbc0b48a9","Type":"ContainerDied","Data":"91ac45de35d8065be0523cff1da4ae4682e837464126597b06220006857c5de6"} Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.190991 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.201643 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f0093a7-6fc6-4d3f-a415-2949e7df308d","Type":"ContainerDied","Data":"f9718ee8b1877eac30dadb29e70510e4ee5ef946cbad4750a51da533083af25d"} Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.201702 4922 generic.go:334] "Generic (PLEG): container finished" podID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerID="f9718ee8b1877eac30dadb29e70510e4ee5ef946cbad4750a51da533083af25d" exitCode=0 Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.201689 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.226895 4922 scope.go:117] "RemoveContainer" containerID="0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.283078 4922 scope.go:117] "RemoveContainer" containerID="0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8" Sep 29 10:03:52 crc kubenswrapper[4922]: E0929 10:03:52.284087 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8\": container with ID starting with 0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8 not found: ID does not exist" containerID="0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.284135 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8"} err="failed to get container status \"0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8\": rpc error: code = NotFound desc = could not find container 
\"0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8\": container with ID starting with 0eb5e76bc314955ba65a00625a925ca5e0132c4dbf7840b930eb092c0da3e7f8 not found: ID does not exist" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.285053 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.295955 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.306057 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:03:52 crc kubenswrapper[4922]: E0929 10:03:52.306691 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8a0865-0468-4195-a132-ba1fbc0b48a9" containerName="nova-scheduler-scheduler" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.306710 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8a0865-0468-4195-a132-ba1fbc0b48a9" containerName="nova-scheduler-scheduler" Sep 29 10:03:52 crc kubenswrapper[4922]: E0929 10:03:52.306741 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-api" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.306747 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-api" Sep 29 10:03:52 crc kubenswrapper[4922]: E0929 10:03:52.306767 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-log" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.306777 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-log" Sep 29 10:03:52 crc kubenswrapper[4922]: E0929 10:03:52.306793 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="13dedc08-5eea-497d-a7f0-8509ea2000c0" containerName="kube-state-metrics" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.306799 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dedc08-5eea-497d-a7f0-8509ea2000c0" containerName="kube-state-metrics" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.307017 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dedc08-5eea-497d-a7f0-8509ea2000c0" containerName="kube-state-metrics" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.307028 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8a0865-0468-4195-a132-ba1fbc0b48a9" containerName="nova-scheduler-scheduler" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.307044 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-api" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.307059 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" containerName="nova-api-log" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.308098 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.310753 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.316122 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.369208 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26qvz\" (UniqueName: \"kubernetes.io/projected/13dedc08-5eea-497d-a7f0-8509ea2000c0-kube-api-access-26qvz\") pod \"13dedc08-5eea-497d-a7f0-8509ea2000c0\" (UID: \"13dedc08-5eea-497d-a7f0-8509ea2000c0\") " Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.369408 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thlcd\" (UniqueName: \"kubernetes.io/projected/1f0093a7-6fc6-4d3f-a415-2949e7df308d-kube-api-access-thlcd\") pod \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.369496 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-config-data\") pod \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.369937 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f0093a7-6fc6-4d3f-a415-2949e7df308d-logs\") pod \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.370000 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-combined-ca-bundle\") pod \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\" (UID: \"1f0093a7-6fc6-4d3f-a415-2949e7df308d\") " Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.370584 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f0093a7-6fc6-4d3f-a415-2949e7df308d-logs" (OuterVolumeSpecName: "logs") pod "1f0093a7-6fc6-4d3f-a415-2949e7df308d" (UID: "1f0093a7-6fc6-4d3f-a415-2949e7df308d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.370881 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f0093a7-6fc6-4d3f-a415-2949e7df308d-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.374266 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dedc08-5eea-497d-a7f0-8509ea2000c0-kube-api-access-26qvz" (OuterVolumeSpecName: "kube-api-access-26qvz") pod "13dedc08-5eea-497d-a7f0-8509ea2000c0" (UID: "13dedc08-5eea-497d-a7f0-8509ea2000c0"). InnerVolumeSpecName "kube-api-access-26qvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.375044 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0093a7-6fc6-4d3f-a415-2949e7df308d-kube-api-access-thlcd" (OuterVolumeSpecName: "kube-api-access-thlcd") pod "1f0093a7-6fc6-4d3f-a415-2949e7df308d" (UID: "1f0093a7-6fc6-4d3f-a415-2949e7df308d"). InnerVolumeSpecName "kube-api-access-thlcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.400106 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-config-data" (OuterVolumeSpecName: "config-data") pod "1f0093a7-6fc6-4d3f-a415-2949e7df308d" (UID: "1f0093a7-6fc6-4d3f-a415-2949e7df308d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.400649 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f0093a7-6fc6-4d3f-a415-2949e7df308d" (UID: "1f0093a7-6fc6-4d3f-a415-2949e7df308d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.472721 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.472801 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-kube-api-access-tnmdr\") pod \"nova-scheduler-0\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.473192 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-config-data\") pod 
\"nova-scheduler-0\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.473553 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thlcd\" (UniqueName: \"kubernetes.io/projected/1f0093a7-6fc6-4d3f-a415-2949e7df308d-kube-api-access-thlcd\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.473577 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.473590 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0093a7-6fc6-4d3f-a415-2949e7df308d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.473601 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26qvz\" (UniqueName: \"kubernetes.io/projected/13dedc08-5eea-497d-a7f0-8509ea2000c0-kube-api-access-26qvz\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.560017 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.560109 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.575698 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-config-data\") pod \"nova-scheduler-0\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.575926 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.575993 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-kube-api-access-tnmdr\") pod \"nova-scheduler-0\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.581159 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-config-data\") pod \"nova-scheduler-0\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.583784 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.600646 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-kube-api-access-tnmdr\") pod \"nova-scheduler-0\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " pod="openstack/nova-scheduler-0" Sep 29 10:03:52 crc kubenswrapper[4922]: I0929 10:03:52.628367 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.052890 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.244052 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.244014 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f0093a7-6fc6-4d3f-a415-2949e7df308d","Type":"ContainerDied","Data":"ff01ac51476f074bbfae4477fd60427cd685f8136e8375ef5f99136f16a2b11f"} Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.244144 4922 scope.go:117] "RemoveContainer" containerID="f9718ee8b1877eac30dadb29e70510e4ee5ef946cbad4750a51da533083af25d" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.254697 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d","Type":"ContainerStarted","Data":"8af545db079fafa43c42db2eecad91043af5d05959926d0d82353b64c3c3d455"} Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.260142 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13dedc08-5eea-497d-a7f0-8509ea2000c0","Type":"ContainerDied","Data":"aea6de548e89658a44fcdd4f43ca950bc84d945f4a7e82a4c9106dc6f277f913"} Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.260364 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.277134 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.277100224 podStartE2EDuration="1.277100224s" podCreationTimestamp="2025-09-29 10:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:53.27360243 +0000 UTC m=+1158.639832694" watchObservedRunningTime="2025-09-29 10:03:53.277100224 +0000 UTC m=+1158.643330488" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.283497 4922 scope.go:117] "RemoveContainer" containerID="6803a3b1183e897ee335a27051dfdedc2609e621ecc98e5c4b3bafb80715e374" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.320915 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.335344 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.359140 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.376123 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.386635 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.389574 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.392972 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.408891 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.411817 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.414985 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.415297 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.420616 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.430972 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.431478 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.431581 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8000729b-19d9-47cd-baa5-7ee4bab9cc04-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 
10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.431639 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpsc4\" (UniqueName: \"kubernetes.io/projected/8000729b-19d9-47cd-baa5-7ee4bab9cc04-kube-api-access-lpsc4\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.431689 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8000729b-19d9-47cd-baa5-7ee4bab9cc04-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.431717 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-logs\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.431741 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-config-data\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.431806 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-886wp\" (UniqueName: \"kubernetes.io/projected/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-kube-api-access-886wp\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.431827 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8000729b-19d9-47cd-baa5-7ee4bab9cc04-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.466184 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13dedc08-5eea-497d-a7f0-8509ea2000c0" path="/var/lib/kubelet/pods/13dedc08-5eea-497d-a7f0-8509ea2000c0/volumes" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.467051 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0093a7-6fc6-4d3f-a415-2949e7df308d" path="/var/lib/kubelet/pods/1f0093a7-6fc6-4d3f-a415-2949e7df308d/volumes" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.467716 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8a0865-0468-4195-a132-ba1fbc0b48a9" path="/var/lib/kubelet/pods/eb8a0865-0468-4195-a132-ba1fbc0b48a9/volumes" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.533603 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpsc4\" (UniqueName: \"kubernetes.io/projected/8000729b-19d9-47cd-baa5-7ee4bab9cc04-kube-api-access-lpsc4\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.533707 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8000729b-19d9-47cd-baa5-7ee4bab9cc04-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.533739 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-logs\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.533766 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-config-data\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.533830 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-886wp\" (UniqueName: \"kubernetes.io/projected/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-kube-api-access-886wp\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.533878 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8000729b-19d9-47cd-baa5-7ee4bab9cc04-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.533904 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.533958 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8000729b-19d9-47cd-baa5-7ee4bab9cc04-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.535264 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-logs\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.540251 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8000729b-19d9-47cd-baa5-7ee4bab9cc04-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.540377 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.541329 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-config-data\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.543656 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8000729b-19d9-47cd-baa5-7ee4bab9cc04-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.544370 4922 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8000729b-19d9-47cd-baa5-7ee4bab9cc04-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.556918 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpsc4\" (UniqueName: \"kubernetes.io/projected/8000729b-19d9-47cd-baa5-7ee4bab9cc04-kube-api-access-lpsc4\") pod \"kube-state-metrics-0\" (UID: \"8000729b-19d9-47cd-baa5-7ee4bab9cc04\") " pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.563938 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-886wp\" (UniqueName: \"kubernetes.io/projected/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-kube-api-access-886wp\") pod \"nova-api-0\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.713245 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.743165 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.971813 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.972604 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="ceilometer-central-agent" containerID="cri-o://0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923" gracePeriod=30 Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.972757 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="ceilometer-notification-agent" containerID="cri-o://9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6" gracePeriod=30 Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.972747 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="sg-core" containerID="cri-o://8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f" gracePeriod=30 Sep 29 10:03:53 crc kubenswrapper[4922]: I0929 10:03:53.972822 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="proxy-httpd" containerID="cri-o://be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f" gracePeriod=30 Sep 29 10:03:54 crc kubenswrapper[4922]: I0929 10:03:54.287924 4922 generic.go:334] "Generic (PLEG): container finished" podID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerID="be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f" exitCode=0 Sep 29 10:03:54 crc kubenswrapper[4922]: I0929 10:03:54.287975 4922 generic.go:334] "Generic (PLEG): container finished" 
podID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerID="8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f" exitCode=2 Sep 29 10:03:54 crc kubenswrapper[4922]: I0929 10:03:54.287994 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerDied","Data":"be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f"} Sep 29 10:03:54 crc kubenswrapper[4922]: I0929 10:03:54.288121 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerDied","Data":"8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f"} Sep 29 10:03:54 crc kubenswrapper[4922]: I0929 10:03:54.298245 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d","Type":"ContainerStarted","Data":"07187f390913ac5f0ec7e7978704ae237933afa2f1288d15d11c60f79d163c38"} Sep 29 10:03:54 crc kubenswrapper[4922]: I0929 10:03:54.303915 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:03:54 crc kubenswrapper[4922]: W0929 10:03:54.308039 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fcbf4cf_3736_41e9_9f6f_75e60d3753a5.slice/crio-d1fc97391ae8889cfdd4a0ccd822376663de13bf7c7a15c4a8c843333d5a8404 WatchSource:0}: Error finding container d1fc97391ae8889cfdd4a0ccd822376663de13bf7c7a15c4a8c843333d5a8404: Status 404 returned error can't find the container with id d1fc97391ae8889cfdd4a0ccd822376663de13bf7c7a15c4a8c843333d5a8404 Sep 29 10:03:54 crc kubenswrapper[4922]: I0929 10:03:54.383418 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 10:03:54 crc kubenswrapper[4922]: W0929 10:03:54.395552 4922 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8000729b_19d9_47cd_baa5_7ee4bab9cc04.slice/crio-774c747b549cc64a4a849baa37875802f2ccfdea08bc61866af7967e4f79f547 WatchSource:0}: Error finding container 774c747b549cc64a4a849baa37875802f2ccfdea08bc61866af7967e4f79f547: Status 404 returned error can't find the container with id 774c747b549cc64a4a849baa37875802f2ccfdea08bc61866af7967e4f79f547 Sep 29 10:03:55 crc kubenswrapper[4922]: I0929 10:03:55.315616 4922 generic.go:334] "Generic (PLEG): container finished" podID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerID="0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923" exitCode=0 Sep 29 10:03:55 crc kubenswrapper[4922]: I0929 10:03:55.315724 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerDied","Data":"0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923"} Sep 29 10:03:55 crc kubenswrapper[4922]: I0929 10:03:55.318736 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8000729b-19d9-47cd-baa5-7ee4bab9cc04","Type":"ContainerStarted","Data":"932ec05b7ec272e90ee2ed1e1bf9fe62437dd4406e86016c513591b2df65d2c8"} Sep 29 10:03:55 crc kubenswrapper[4922]: I0929 10:03:55.318766 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8000729b-19d9-47cd-baa5-7ee4bab9cc04","Type":"ContainerStarted","Data":"774c747b549cc64a4a849baa37875802f2ccfdea08bc61866af7967e4f79f547"} Sep 29 10:03:55 crc kubenswrapper[4922]: I0929 10:03:55.319069 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 29 10:03:55 crc kubenswrapper[4922]: I0929 10:03:55.322300 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5","Type":"ContainerStarted","Data":"cf366e5f7a000734a4eac4b1ed2cef2d2e1895363ff318bd4179a2051c5ef630"} Sep 29 10:03:55 crc kubenswrapper[4922]: I0929 10:03:55.322333 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5","Type":"ContainerStarted","Data":"9b96d8027206a5ca9d0c46914b95f16ada9699ceb7770928b1a075ecf4c76af6"} Sep 29 10:03:55 crc kubenswrapper[4922]: I0929 10:03:55.322344 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5","Type":"ContainerStarted","Data":"d1fc97391ae8889cfdd4a0ccd822376663de13bf7c7a15c4a8c843333d5a8404"} Sep 29 10:03:55 crc kubenswrapper[4922]: I0929 10:03:55.353524 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.946293189 podStartE2EDuration="2.353489216s" podCreationTimestamp="2025-09-29 10:03:53 +0000 UTC" firstStartedPulling="2025-09-29 10:03:54.39990576 +0000 UTC m=+1159.766136024" lastFinishedPulling="2025-09-29 10:03:54.807101787 +0000 UTC m=+1160.173332051" observedRunningTime="2025-09-29 10:03:55.341462816 +0000 UTC m=+1160.707693160" watchObservedRunningTime="2025-09-29 10:03:55.353489216 +0000 UTC m=+1160.719719480" Sep 29 10:03:56 crc kubenswrapper[4922]: I0929 10:03:56.976579 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.013966 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.013933011 podStartE2EDuration="4.013933011s" podCreationTimestamp="2025-09-29 10:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:03:55.37061381 +0000 UTC m=+1160.736844074" watchObservedRunningTime="2025-09-29 10:03:57.013933011 +0000 UTC m=+1162.380163275" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.157094 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-scripts\") pod \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.157227 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-run-httpd\") pod \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.157389 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-log-httpd\") pod \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.157491 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpnw6\" (UniqueName: \"kubernetes.io/projected/0a806f02-d7e2-4d51-b21c-cb63c0475e53-kube-api-access-dpnw6\") pod \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " 
Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.157548 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-config-data\") pod \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.157756 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-combined-ca-bundle\") pod \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.157794 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-sg-core-conf-yaml\") pod \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\" (UID: \"0a806f02-d7e2-4d51-b21c-cb63c0475e53\") " Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.161310 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0a806f02-d7e2-4d51-b21c-cb63c0475e53" (UID: "0a806f02-d7e2-4d51-b21c-cb63c0475e53"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.161608 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0a806f02-d7e2-4d51-b21c-cb63c0475e53" (UID: "0a806f02-d7e2-4d51-b21c-cb63c0475e53"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.171316 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-scripts" (OuterVolumeSpecName: "scripts") pod "0a806f02-d7e2-4d51-b21c-cb63c0475e53" (UID: "0a806f02-d7e2-4d51-b21c-cb63c0475e53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.171346 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a806f02-d7e2-4d51-b21c-cb63c0475e53-kube-api-access-dpnw6" (OuterVolumeSpecName: "kube-api-access-dpnw6") pod "0a806f02-d7e2-4d51-b21c-cb63c0475e53" (UID: "0a806f02-d7e2-4d51-b21c-cb63c0475e53"). InnerVolumeSpecName "kube-api-access-dpnw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.194001 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0a806f02-d7e2-4d51-b21c-cb63c0475e53" (UID: "0a806f02-d7e2-4d51-b21c-cb63c0475e53"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.239501 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a806f02-d7e2-4d51-b21c-cb63c0475e53" (UID: "0a806f02-d7e2-4d51-b21c-cb63c0475e53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.263094 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.263201 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.263219 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.263250 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.263264 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a806f02-d7e2-4d51-b21c-cb63c0475e53-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.263276 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpnw6\" (UniqueName: \"kubernetes.io/projected/0a806f02-d7e2-4d51-b21c-cb63c0475e53-kube-api-access-dpnw6\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.277778 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-config-data" (OuterVolumeSpecName: "config-data") pod "0a806f02-d7e2-4d51-b21c-cb63c0475e53" (UID: "0a806f02-d7e2-4d51-b21c-cb63c0475e53"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.347550 4922 generic.go:334] "Generic (PLEG): container finished" podID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerID="9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6" exitCode=0 Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.347771 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerDied","Data":"9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6"} Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.348034 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.348376 4922 scope.go:117] "RemoveContainer" containerID="be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.348309 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a806f02-d7e2-4d51-b21c-cb63c0475e53","Type":"ContainerDied","Data":"5e378ddb3ab81c95aa9de229c4e89d71260f39b7ebc80c2719083bd52ba0bf02"} Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.365656 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a806f02-d7e2-4d51-b21c-cb63c0475e53-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.382195 4922 scope.go:117] "RemoveContainer" containerID="8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.396950 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.415787 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.419335 4922 scope.go:117] "RemoveContainer" containerID="9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.435302 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:57 crc kubenswrapper[4922]: E0929 10:03:57.436189 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="ceilometer-notification-agent" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.436220 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="ceilometer-notification-agent" Sep 29 10:03:57 crc kubenswrapper[4922]: E0929 10:03:57.436233 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="ceilometer-central-agent" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.436243 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="ceilometer-central-agent" Sep 29 10:03:57 crc kubenswrapper[4922]: E0929 10:03:57.436289 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="proxy-httpd" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.436297 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="proxy-httpd" Sep 29 10:03:57 crc kubenswrapper[4922]: E0929 10:03:57.436319 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="sg-core" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.436331 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="sg-core" Sep 29 10:03:57 crc 
kubenswrapper[4922]: I0929 10:03:57.436579 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="proxy-httpd" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.436620 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="ceilometer-notification-agent" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.436638 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="ceilometer-central-agent" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.436662 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" containerName="sg-core" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.444973 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.445822 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.448663 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.449374 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.449641 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.453494 4922 scope.go:117] "RemoveContainer" containerID="0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.472236 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-run-httpd\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.472322 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-log-httpd\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.473420 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-config-data\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.473618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.473767 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-scripts\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.473957 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " 
pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.474098 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.474331 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfh2p\" (UniqueName: \"kubernetes.io/projected/01fa6de8-aca7-4347-a7a4-b54a521bb378-kube-api-access-tfh2p\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.477682 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a806f02-d7e2-4d51-b21c-cb63c0475e53" path="/var/lib/kubelet/pods/0a806f02-d7e2-4d51-b21c-cb63c0475e53/volumes" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.493538 4922 scope.go:117] "RemoveContainer" containerID="be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f" Sep 29 10:03:57 crc kubenswrapper[4922]: E0929 10:03:57.496625 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f\": container with ID starting with be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f not found: ID does not exist" containerID="be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.498023 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f"} err="failed to get container status 
\"be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f\": rpc error: code = NotFound desc = could not find container \"be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f\": container with ID starting with be0e781c4efe296110182e60c1a3c3d263ed78eee8e82b319d7f063ecdd5d65f not found: ID does not exist" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.498142 4922 scope.go:117] "RemoveContainer" containerID="8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f" Sep 29 10:03:57 crc kubenswrapper[4922]: E0929 10:03:57.499000 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f\": container with ID starting with 8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f not found: ID does not exist" containerID="8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.499075 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f"} err="failed to get container status \"8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f\": rpc error: code = NotFound desc = could not find container \"8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f\": container with ID starting with 8fa93f470f654df07e9025381dba4d99256c442d350eca557e997293cc7aa91f not found: ID does not exist" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.499120 4922 scope.go:117] "RemoveContainer" containerID="9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6" Sep 29 10:03:57 crc kubenswrapper[4922]: E0929 10:03:57.499601 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6\": container with ID starting with 9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6 not found: ID does not exist" containerID="9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.499710 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6"} err="failed to get container status \"9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6\": rpc error: code = NotFound desc = could not find container \"9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6\": container with ID starting with 9d5aca0afd5c46c551b5372dcedbc3ee515c5dbc6d1cb1485ff7dcf4c29920c6 not found: ID does not exist" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.499784 4922 scope.go:117] "RemoveContainer" containerID="0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923" Sep 29 10:03:57 crc kubenswrapper[4922]: E0929 10:03:57.501618 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923\": container with ID starting with 0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923 not found: ID does not exist" containerID="0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.501709 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923"} err="failed to get container status \"0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923\": rpc error: code = NotFound desc = could not find container \"0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923\": container with ID 
starting with 0eb3d59333e4859cbab784b2cf443ec0eadd48f1a390fa910d90db459a4fc923 not found: ID does not exist" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.560265 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.560380 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.576601 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-log-httpd\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.576688 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-config-data\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.576737 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.576767 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-scripts\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.576956 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.577021 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.577080 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfh2p\" (UniqueName: \"kubernetes.io/projected/01fa6de8-aca7-4347-a7a4-b54a521bb378-kube-api-access-tfh2p\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.577107 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-run-httpd\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.577903 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-log-httpd\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.579554 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-run-httpd\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc 
kubenswrapper[4922]: I0929 10:03:57.583498 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.584102 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.585197 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.586019 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-scripts\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.589594 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-config-data\") pod \"ceilometer-0\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.601521 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfh2p\" (UniqueName: \"kubernetes.io/projected/01fa6de8-aca7-4347-a7a4-b54a521bb378-kube-api-access-tfh2p\") pod \"ceilometer-0\" 
(UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " pod="openstack/ceilometer-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.629393 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 29 10:03:57 crc kubenswrapper[4922]: I0929 10:03:57.773640 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:03:58 crc kubenswrapper[4922]: I0929 10:03:58.333175 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:03:58 crc kubenswrapper[4922]: I0929 10:03:58.360785 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerStarted","Data":"28d090d8461700468c5d7d382381561a712a4f67ba91bec79fe9520470e116c3"} Sep 29 10:03:58 crc kubenswrapper[4922]: I0929 10:03:58.577006 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:03:58 crc kubenswrapper[4922]: I0929 10:03:58.577046 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:03:58 crc kubenswrapper[4922]: I0929 10:03:58.652630 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 29 10:03:59 crc kubenswrapper[4922]: I0929 10:03:59.377087 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerStarted","Data":"1d26e4774a8116df6ac60f7d947963b5c0475bbda1c3f9405a0958c9071f5a6d"} Sep 29 10:04:00 crc kubenswrapper[4922]: I0929 10:04:00.392138 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerStarted","Data":"6b9dff4e0ab33a01c04c3251a0b1d82c57110013a8c51c36e9f194916149f7b4"} Sep 29 10:04:01 crc kubenswrapper[4922]: I0929 10:04:01.409134 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerStarted","Data":"2fce937232a5ec3c3be9d0b923443582b402d176725c32a84db035f90dbeb61d"} Sep 29 10:04:02 crc kubenswrapper[4922]: I0929 10:04:02.629171 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 29 10:04:02 crc kubenswrapper[4922]: I0929 10:04:02.660372 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 29 10:04:03 crc kubenswrapper[4922]: I0929 10:04:03.441207 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerStarted","Data":"c6563560759ac2f5019c340d6dc85678c78e23c7184fa7f06d7a715bbddab958"} Sep 29 10:04:03 crc kubenswrapper[4922]: I0929 10:04:03.484298 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.097513014 podStartE2EDuration="6.484271758s" podCreationTimestamp="2025-09-29 10:03:57 +0000 UTC" firstStartedPulling="2025-09-29 10:03:58.339120188 +0000 UTC m=+1163.705350452" lastFinishedPulling="2025-09-29 10:04:02.725878932 +0000 UTC m=+1168.092109196" observedRunningTime="2025-09-29 10:04:03.47869474 +0000 UTC m=+1168.844925004" watchObservedRunningTime="2025-09-29 10:04:03.484271758 +0000 UTC m=+1168.850502022" 
Sep 29 10:04:03 crc kubenswrapper[4922]: I0929 10:04:03.497184 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 29 10:04:03 crc kubenswrapper[4922]: I0929 10:04:03.713443 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:04:03 crc kubenswrapper[4922]: I0929 10:04:03.713820 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:04:03 crc kubenswrapper[4922]: I0929 10:04:03.757254 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 29 10:04:04 crc kubenswrapper[4922]: I0929 10:04:04.452511 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:04:04 crc kubenswrapper[4922]: I0929 10:04:04.795125 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:04:04 crc kubenswrapper[4922]: I0929 10:04:04.795164 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:04:07 crc kubenswrapper[4922]: I0929 10:04:07.571921 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 10:04:07 crc kubenswrapper[4922]: I0929 10:04:07.572730 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 10:04:07 crc kubenswrapper[4922]: I0929 10:04:07.581231 4922 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 10:04:07 crc kubenswrapper[4922]: I0929 10:04:07.584179 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.405637 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.485890 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-config-data\") pod \"6d2eb931-8920-4306-81db-b71e9162dbb3\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.485955 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-combined-ca-bundle\") pod \"6d2eb931-8920-4306-81db-b71e9162dbb3\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.486141 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk9p7\" (UniqueName: \"kubernetes.io/projected/6d2eb931-8920-4306-81db-b71e9162dbb3-kube-api-access-zk9p7\") pod \"6d2eb931-8920-4306-81db-b71e9162dbb3\" (UID: \"6d2eb931-8920-4306-81db-b71e9162dbb3\") " Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.494063 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2eb931-8920-4306-81db-b71e9162dbb3-kube-api-access-zk9p7" (OuterVolumeSpecName: "kube-api-access-zk9p7") pod "6d2eb931-8920-4306-81db-b71e9162dbb3" (UID: "6d2eb931-8920-4306-81db-b71e9162dbb3"). InnerVolumeSpecName "kube-api-access-zk9p7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.520937 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d2eb931-8920-4306-81db-b71e9162dbb3" (UID: "6d2eb931-8920-4306-81db-b71e9162dbb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.535518 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-config-data" (OuterVolumeSpecName: "config-data") pod "6d2eb931-8920-4306-81db-b71e9162dbb3" (UID: "6d2eb931-8920-4306-81db-b71e9162dbb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.543542 4922 generic.go:334] "Generic (PLEG): container finished" podID="6d2eb931-8920-4306-81db-b71e9162dbb3" containerID="a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d" exitCode=137 Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.543621 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d2eb931-8920-4306-81db-b71e9162dbb3","Type":"ContainerDied","Data":"a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d"} Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.543663 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d2eb931-8920-4306-81db-b71e9162dbb3","Type":"ContainerDied","Data":"7354c52640e0cdfd6ba1053fd6029783cb98df77337a32069af175116fd7c405"} Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.543716 4922 scope.go:117] "RemoveContainer" containerID="a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d" Sep 29 10:04:10 
crc kubenswrapper[4922]: I0929 10:04:10.544201 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.588928 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.588980 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2eb931-8920-4306-81db-b71e9162dbb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.588993 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk9p7\" (UniqueName: \"kubernetes.io/projected/6d2eb931-8920-4306-81db-b71e9162dbb3-kube-api-access-zk9p7\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.633815 4922 scope.go:117] "RemoveContainer" containerID="a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d" Sep 29 10:04:10 crc kubenswrapper[4922]: E0929 10:04:10.634390 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d\": container with ID starting with a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d not found: ID does not exist" containerID="a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.634420 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d"} err="failed to get container status \"a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d\": rpc error: code = NotFound desc = 
could not find container \"a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d\": container with ID starting with a5eff55b6197b6d982ace8015403b9328782c14ab532ece582f3ee36a8e5218d not found: ID does not exist" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.636344 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.648278 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.669760 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:04:10 crc kubenswrapper[4922]: E0929 10:04:10.670320 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2eb931-8920-4306-81db-b71e9162dbb3" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.670338 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2eb931-8920-4306-81db-b71e9162dbb3" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.670560 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2eb931-8920-4306-81db-b71e9162dbb3" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.671474 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.674922 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.675110 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.680361 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.692075 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.794816 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.795014 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.795190 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc 
kubenswrapper[4922]: I0929 10:04:10.795212 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8v88\" (UniqueName: \"kubernetes.io/projected/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-kube-api-access-h8v88\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.795331 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.897657 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.898096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8v88\" (UniqueName: \"kubernetes.io/projected/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-kube-api-access-h8v88\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.898151 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc 
kubenswrapper[4922]: I0929 10:04:10.898237 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.898301 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.903810 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.904218 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.905512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.906157 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.921084 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8v88\" (UniqueName: \"kubernetes.io/projected/c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528-kube-api-access-h8v88\") pod \"nova-cell1-novncproxy-0\" (UID: \"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:10 crc kubenswrapper[4922]: I0929 10:04:10.993475 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:11 crc kubenswrapper[4922]: I0929 10:04:11.463258 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d2eb931-8920-4306-81db-b71e9162dbb3" path="/var/lib/kubelet/pods/6d2eb931-8920-4306-81db-b71e9162dbb3/volumes" Sep 29 10:04:11 crc kubenswrapper[4922]: I0929 10:04:11.525241 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 10:04:11 crc kubenswrapper[4922]: W0929 10:04:11.526257 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc69f22d7_5c9b_4f45_bb81_1a5ff8ac8528.slice/crio-f5226ad032a0e0b5b69ef9d140ab226757b89c9a5cbade65f5e4e257f525297b WatchSource:0}: Error finding container f5226ad032a0e0b5b69ef9d140ab226757b89c9a5cbade65f5e4e257f525297b: Status 404 returned error can't find the container with id f5226ad032a0e0b5b69ef9d140ab226757b89c9a5cbade65f5e4e257f525297b Sep 29 10:04:11 crc kubenswrapper[4922]: I0929 10:04:11.571024 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528","Type":"ContainerStarted","Data":"f5226ad032a0e0b5b69ef9d140ab226757b89c9a5cbade65f5e4e257f525297b"} Sep 29 10:04:12 crc kubenswrapper[4922]: I0929 10:04:12.588558 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528","Type":"ContainerStarted","Data":"7073579a80bf741a197a320906d4eb9b31c1b7a83aedbd273bcf1707354cf106"} Sep 29 10:04:12 crc kubenswrapper[4922]: I0929 10:04:12.621801 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.621778656 podStartE2EDuration="2.621778656s" podCreationTimestamp="2025-09-29 10:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:12.619981709 +0000 UTC m=+1177.986212013" watchObservedRunningTime="2025-09-29 10:04:12.621778656 +0000 UTC m=+1177.988008920" Sep 29 10:04:13 crc kubenswrapper[4922]: I0929 10:04:13.719624 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 10:04:13 crc kubenswrapper[4922]: I0929 10:04:13.719877 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 10:04:13 crc kubenswrapper[4922]: I0929 10:04:13.720497 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 10:04:13 crc kubenswrapper[4922]: I0929 10:04:13.720524 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 10:04:13 crc kubenswrapper[4922]: I0929 10:04:13.723270 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 10:04:13 crc kubenswrapper[4922]: I0929 10:04:13.724850 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Sep 29 10:04:13 crc kubenswrapper[4922]: I0929 10:04:13.957177 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-7qztw"] Sep 29 10:04:13 crc kubenswrapper[4922]: I0929 10:04:13.959196 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:13 crc kubenswrapper[4922]: I0929 10:04:13.980182 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-7qztw"] Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.066401 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4k2p\" (UniqueName: \"kubernetes.io/projected/4945ef36-899a-4e42-b95e-b5dfcca99783-kube-api-access-v4k2p\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.066465 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.066494 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.066519 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-config\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.066550 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.066574 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.167809 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4k2p\" (UniqueName: \"kubernetes.io/projected/4945ef36-899a-4e42-b95e-b5dfcca99783-kube-api-access-v4k2p\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.167906 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.167938 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.167965 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-config\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.167994 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.168014 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.169288 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.169470 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-sb\") 
pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.169641 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-config\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.169890 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.169951 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.201343 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4k2p\" (UniqueName: \"kubernetes.io/projected/4945ef36-899a-4e42-b95e-b5dfcca99783-kube-api-access-v4k2p\") pod \"dnsmasq-dns-59cf4bdb65-7qztw\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.284361 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:14 crc kubenswrapper[4922]: I0929 10:04:14.819271 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-7qztw"] Sep 29 10:04:15 crc kubenswrapper[4922]: I0929 10:04:15.639052 4922 generic.go:334] "Generic (PLEG): container finished" podID="4945ef36-899a-4e42-b95e-b5dfcca99783" containerID="ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79" exitCode=0 Sep 29 10:04:15 crc kubenswrapper[4922]: I0929 10:04:15.639123 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" event={"ID":"4945ef36-899a-4e42-b95e-b5dfcca99783","Type":"ContainerDied","Data":"ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79"} Sep 29 10:04:15 crc kubenswrapper[4922]: I0929 10:04:15.639816 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" event={"ID":"4945ef36-899a-4e42-b95e-b5dfcca99783","Type":"ContainerStarted","Data":"7ee9f747d30eeee34cfd0ead6c93d73d647d8428f1b59dfa46ecf27ca5ce3528"} Sep 29 10:04:15 crc kubenswrapper[4922]: I0929 10:04:15.993823 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.072642 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.073023 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="ceilometer-central-agent" containerID="cri-o://1d26e4774a8116df6ac60f7d947963b5c0475bbda1c3f9405a0958c9071f5a6d" gracePeriod=30 Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.073143 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="proxy-httpd" containerID="cri-o://c6563560759ac2f5019c340d6dc85678c78e23c7184fa7f06d7a715bbddab958" gracePeriod=30 Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.073218 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="sg-core" containerID="cri-o://2fce937232a5ec3c3be9d0b923443582b402d176725c32a84db035f90dbeb61d" gracePeriod=30 Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.073263 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="ceilometer-notification-agent" containerID="cri-o://6b9dff4e0ab33a01c04c3251a0b1d82c57110013a8c51c36e9f194916149f7b4" gracePeriod=30 Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.097182 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.653033 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" event={"ID":"4945ef36-899a-4e42-b95e-b5dfcca99783","Type":"ContainerStarted","Data":"e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5"} Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.653190 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.658266 4922 generic.go:334] "Generic (PLEG): container finished" podID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerID="c6563560759ac2f5019c340d6dc85678c78e23c7184fa7f06d7a715bbddab958" exitCode=0 Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.658298 4922 
generic.go:334] "Generic (PLEG): container finished" podID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerID="2fce937232a5ec3c3be9d0b923443582b402d176725c32a84db035f90dbeb61d" exitCode=2 Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.658307 4922 generic.go:334] "Generic (PLEG): container finished" podID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerID="1d26e4774a8116df6ac60f7d947963b5c0475bbda1c3f9405a0958c9071f5a6d" exitCode=0 Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.658335 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerDied","Data":"c6563560759ac2f5019c340d6dc85678c78e23c7184fa7f06d7a715bbddab958"} Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.658369 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerDied","Data":"2fce937232a5ec3c3be9d0b923443582b402d176725c32a84db035f90dbeb61d"} Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.658382 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerDied","Data":"1d26e4774a8116df6ac60f7d947963b5c0475bbda1c3f9405a0958c9071f5a6d"} Sep 29 10:04:16 crc kubenswrapper[4922]: I0929 10:04:16.694052 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" podStartSLOduration=3.694033224 podStartE2EDuration="3.694033224s" podCreationTimestamp="2025-09-29 10:04:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:16.686422112 +0000 UTC m=+1182.052652386" watchObservedRunningTime="2025-09-29 10:04:16.694033224 +0000 UTC m=+1182.060263488" Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.114016 4922 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.114395 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-log" containerID="cri-o://9b96d8027206a5ca9d0c46914b95f16ada9699ceb7770928b1a075ecf4c76af6" gracePeriod=30 Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.114529 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-api" containerID="cri-o://cf366e5f7a000734a4eac4b1ed2cef2d2e1895363ff318bd4179a2051c5ef630" gracePeriod=30 Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.675521 4922 generic.go:334] "Generic (PLEG): container finished" podID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerID="9b96d8027206a5ca9d0c46914b95f16ada9699ceb7770928b1a075ecf4c76af6" exitCode=143 Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.675993 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5","Type":"ContainerDied","Data":"9b96d8027206a5ca9d0c46914b95f16ada9699ceb7770928b1a075ecf4c76af6"} Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.678135 4922 generic.go:334] "Generic (PLEG): container finished" podID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerID="6b9dff4e0ab33a01c04c3251a0b1d82c57110013a8c51c36e9f194916149f7b4" exitCode=0 Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.679137 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerDied","Data":"6b9dff4e0ab33a01c04c3251a0b1d82c57110013a8c51c36e9f194916149f7b4"} Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.938456 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.960883 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-ceilometer-tls-certs\") pod \"01fa6de8-aca7-4347-a7a4-b54a521bb378\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.960965 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-log-httpd\") pod \"01fa6de8-aca7-4347-a7a4-b54a521bb378\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.961010 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfh2p\" (UniqueName: \"kubernetes.io/projected/01fa6de8-aca7-4347-a7a4-b54a521bb378-kube-api-access-tfh2p\") pod \"01fa6de8-aca7-4347-a7a4-b54a521bb378\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.961057 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-combined-ca-bundle\") pod \"01fa6de8-aca7-4347-a7a4-b54a521bb378\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.961106 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-scripts\") pod \"01fa6de8-aca7-4347-a7a4-b54a521bb378\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.961177 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-sg-core-conf-yaml\") pod \"01fa6de8-aca7-4347-a7a4-b54a521bb378\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.961209 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-run-httpd\") pod \"01fa6de8-aca7-4347-a7a4-b54a521bb378\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.961345 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-config-data\") pod \"01fa6de8-aca7-4347-a7a4-b54a521bb378\" (UID: \"01fa6de8-aca7-4347-a7a4-b54a521bb378\") " Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.962144 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01fa6de8-aca7-4347-a7a4-b54a521bb378" (UID: "01fa6de8-aca7-4347-a7a4-b54a521bb378"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.966403 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01fa6de8-aca7-4347-a7a4-b54a521bb378" (UID: "01fa6de8-aca7-4347-a7a4-b54a521bb378"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.994974 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-scripts" (OuterVolumeSpecName: "scripts") pod "01fa6de8-aca7-4347-a7a4-b54a521bb378" (UID: "01fa6de8-aca7-4347-a7a4-b54a521bb378"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:17 crc kubenswrapper[4922]: I0929 10:04:17.998520 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fa6de8-aca7-4347-a7a4-b54a521bb378-kube-api-access-tfh2p" (OuterVolumeSpecName: "kube-api-access-tfh2p") pod "01fa6de8-aca7-4347-a7a4-b54a521bb378" (UID: "01fa6de8-aca7-4347-a7a4-b54a521bb378"). InnerVolumeSpecName "kube-api-access-tfh2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.031211 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01fa6de8-aca7-4347-a7a4-b54a521bb378" (UID: "01fa6de8-aca7-4347-a7a4-b54a521bb378"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.044766 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "01fa6de8-aca7-4347-a7a4-b54a521bb378" (UID: "01fa6de8-aca7-4347-a7a4-b54a521bb378"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.064123 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.064408 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.064500 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.064581 4922 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.064653 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01fa6de8-aca7-4347-a7a4-b54a521bb378-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.064729 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfh2p\" (UniqueName: \"kubernetes.io/projected/01fa6de8-aca7-4347-a7a4-b54a521bb378-kube-api-access-tfh2p\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.091850 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01fa6de8-aca7-4347-a7a4-b54a521bb378" (UID: 
"01fa6de8-aca7-4347-a7a4-b54a521bb378"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.104084 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-config-data" (OuterVolumeSpecName: "config-data") pod "01fa6de8-aca7-4347-a7a4-b54a521bb378" (UID: "01fa6de8-aca7-4347-a7a4-b54a521bb378"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.167101 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.167150 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01fa6de8-aca7-4347-a7a4-b54a521bb378-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.704731 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01fa6de8-aca7-4347-a7a4-b54a521bb378","Type":"ContainerDied","Data":"28d090d8461700468c5d7d382381561a712a4f67ba91bec79fe9520470e116c3"} Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.705336 4922 scope.go:117] "RemoveContainer" containerID="c6563560759ac2f5019c340d6dc85678c78e23c7184fa7f06d7a715bbddab958" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.704871 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.749105 4922 scope.go:117] "RemoveContainer" containerID="2fce937232a5ec3c3be9d0b923443582b402d176725c32a84db035f90dbeb61d" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.768430 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.780899 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.787122 4922 scope.go:117] "RemoveContainer" containerID="6b9dff4e0ab33a01c04c3251a0b1d82c57110013a8c51c36e9f194916149f7b4" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.796877 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:04:18 crc kubenswrapper[4922]: E0929 10:04:18.797474 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="ceilometer-notification-agent" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.797500 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="ceilometer-notification-agent" Sep 29 10:04:18 crc kubenswrapper[4922]: E0929 10:04:18.797522 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="sg-core" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.797531 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="sg-core" Sep 29 10:04:18 crc kubenswrapper[4922]: E0929 10:04:18.797547 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="ceilometer-central-agent" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.797554 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="ceilometer-central-agent" Sep 29 10:04:18 crc kubenswrapper[4922]: E0929 10:04:18.797585 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="proxy-httpd" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.797592 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="proxy-httpd" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.797806 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="ceilometer-central-agent" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.797823 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="sg-core" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.797865 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="proxy-httpd" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.797883 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" containerName="ceilometer-notification-agent" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.799911 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.805125 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.805299 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.806036 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.824215 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.827820 4922 scope.go:117] "RemoveContainer" containerID="1d26e4774a8116df6ac60f7d947963b5c0475bbda1c3f9405a0958c9071f5a6d" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.884501 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-config-data\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.884889 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.885040 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " 
pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.885211 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ede8dfdb-116f-4e02-8408-aea659020067-log-httpd\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.885320 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.885435 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-scripts\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.885544 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcrxt\" (UniqueName: \"kubernetes.io/projected/ede8dfdb-116f-4e02-8408-aea659020067-kube-api-access-gcrxt\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.885667 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ede8dfdb-116f-4e02-8408-aea659020067-run-httpd\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.989200 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ede8dfdb-116f-4e02-8408-aea659020067-run-httpd\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.988561 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ede8dfdb-116f-4e02-8408-aea659020067-run-httpd\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.989447 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-config-data\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.990255 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.990336 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.990419 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ede8dfdb-116f-4e02-8408-aea659020067-log-httpd\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " 
pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.990448 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.990505 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-scripts\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.990552 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcrxt\" (UniqueName: \"kubernetes.io/projected/ede8dfdb-116f-4e02-8408-aea659020067-kube-api-access-gcrxt\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.991259 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ede8dfdb-116f-4e02-8408-aea659020067-log-httpd\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.994991 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.995638 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:18 crc kubenswrapper[4922]: I0929 10:04:18.996147 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-scripts\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:19 crc kubenswrapper[4922]: I0929 10:04:19.003259 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-config-data\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:19 crc kubenswrapper[4922]: I0929 10:04:19.011775 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede8dfdb-116f-4e02-8408-aea659020067-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:19 crc kubenswrapper[4922]: I0929 10:04:19.022168 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcrxt\" (UniqueName: \"kubernetes.io/projected/ede8dfdb-116f-4e02-8408-aea659020067-kube-api-access-gcrxt\") pod \"ceilometer-0\" (UID: \"ede8dfdb-116f-4e02-8408-aea659020067\") " pod="openstack/ceilometer-0" Sep 29 10:04:19 crc kubenswrapper[4922]: I0929 10:04:19.121326 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 10:04:19 crc kubenswrapper[4922]: I0929 10:04:19.469336 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01fa6de8-aca7-4347-a7a4-b54a521bb378" path="/var/lib/kubelet/pods/01fa6de8-aca7-4347-a7a4-b54a521bb378/volumes" Sep 29 10:04:19 crc kubenswrapper[4922]: I0929 10:04:19.663798 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 10:04:19 crc kubenswrapper[4922]: W0929 10:04:19.677807 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podede8dfdb_116f_4e02_8408_aea659020067.slice/crio-6c9b420cadca15fee5bb26b081314f4d566502ff86646eb63cb388ec6b95c975 WatchSource:0}: Error finding container 6c9b420cadca15fee5bb26b081314f4d566502ff86646eb63cb388ec6b95c975: Status 404 returned error can't find the container with id 6c9b420cadca15fee5bb26b081314f4d566502ff86646eb63cb388ec6b95c975 Sep 29 10:04:19 crc kubenswrapper[4922]: I0929 10:04:19.718132 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ede8dfdb-116f-4e02-8408-aea659020067","Type":"ContainerStarted","Data":"6c9b420cadca15fee5bb26b081314f4d566502ff86646eb63cb388ec6b95c975"} Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.737698 4922 generic.go:334] "Generic (PLEG): container finished" podID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerID="cf366e5f7a000734a4eac4b1ed2cef2d2e1895363ff318bd4179a2051c5ef630" exitCode=0 Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.737810 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5","Type":"ContainerDied","Data":"cf366e5f7a000734a4eac4b1ed2cef2d2e1895363ff318bd4179a2051c5ef630"} Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.738334 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5","Type":"ContainerDied","Data":"d1fc97391ae8889cfdd4a0ccd822376663de13bf7c7a15c4a8c843333d5a8404"} Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.738352 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1fc97391ae8889cfdd4a0ccd822376663de13bf7c7a15c4a8c843333d5a8404" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.741386 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ede8dfdb-116f-4e02-8408-aea659020067","Type":"ContainerStarted","Data":"bc737d1f1680da407d6d6278bd5bfe1e3abec7b38214a4f2640228dee1d84ca8"} Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.777659 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.838009 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-combined-ca-bundle\") pod \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.838105 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-logs\") pod \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.838268 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-886wp\" (UniqueName: \"kubernetes.io/projected/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-kube-api-access-886wp\") pod \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.838296 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-config-data\") pod \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\" (UID: \"9fcbf4cf-3736-41e9-9f6f-75e60d3753a5\") " Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.839689 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-logs" (OuterVolumeSpecName: "logs") pod "9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" (UID: "9fcbf4cf-3736-41e9-9f6f-75e60d3753a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.848724 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-kube-api-access-886wp" (OuterVolumeSpecName: "kube-api-access-886wp") pod "9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" (UID: "9fcbf4cf-3736-41e9-9f6f-75e60d3753a5"). InnerVolumeSpecName "kube-api-access-886wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.882847 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-config-data" (OuterVolumeSpecName: "config-data") pod "9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" (UID: "9fcbf4cf-3736-41e9-9f6f-75e60d3753a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.898015 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" (UID: "9fcbf4cf-3736-41e9-9f6f-75e60d3753a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.943756 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.943820 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-886wp\" (UniqueName: \"kubernetes.io/projected/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-kube-api-access-886wp\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.943856 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.943869 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:20 crc kubenswrapper[4922]: I0929 10:04:20.995514 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.036709 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.755085 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ede8dfdb-116f-4e02-8408-aea659020067","Type":"ContainerStarted","Data":"b7e627e80805aac0beb47f8e3368846b801573377219e54e1c0925d75e544966"} Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.755126 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.780493 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.789557 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.801499 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.823590 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:21 crc kubenswrapper[4922]: E0929 10:04:21.824169 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-api" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.824191 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-api" Sep 29 10:04:21 crc kubenswrapper[4922]: E0929 10:04:21.824223 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-log" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.824233 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-log" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.824451 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-log" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.824475 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" containerName="nova-api-api" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.825642 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.829920 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.830508 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.830658 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.863732 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.874336 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.874408 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-config-data\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.874451 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.874478 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-public-tls-certs\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.874555 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-logs\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.874578 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th4tz\" (UniqueName: \"kubernetes.io/projected/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-kube-api-access-th4tz\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.977741 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-config-data\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.977838 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-public-tls-certs\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.977885 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 
crc kubenswrapper[4922]: I0929 10:04:21.977994 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-logs\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.978030 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th4tz\" (UniqueName: \"kubernetes.io/projected/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-kube-api-access-th4tz\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.978151 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.979521 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-logs\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.984817 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-public-tls-certs\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.985310 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.987766 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-r6vhs"] Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.987952 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-config-data\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.989706 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.998521 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:21 crc kubenswrapper[4922]: I0929 10:04:21.998539 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r6vhs"] Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.041445 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.042561 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.069077 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th4tz\" (UniqueName: \"kubernetes.io/projected/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-kube-api-access-th4tz\") pod \"nova-api-0\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " pod="openstack/nova-api-0" Sep 29 10:04:22 crc 
kubenswrapper[4922]: I0929 10:04:22.080861 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-scripts\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.080922 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-config-data\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.080962 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.080999 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdttp\" (UniqueName: \"kubernetes.io/projected/9964cee5-67a1-4a42-84e3-3586ed6c3457-kube-api-access-zdttp\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.165032 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.197196 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-scripts\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.197290 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-config-data\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.197346 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.197393 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdttp\" (UniqueName: \"kubernetes.io/projected/9964cee5-67a1-4a42-84e3-3586ed6c3457-kube-api-access-zdttp\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.203263 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" 
Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.203873 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-config-data\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.213548 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-scripts\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.219003 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdttp\" (UniqueName: \"kubernetes.io/projected/9964cee5-67a1-4a42-84e3-3586ed6c3457-kube-api-access-zdttp\") pod \"nova-cell1-cell-mapping-r6vhs\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.238578 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.770271 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.811326 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ede8dfdb-116f-4e02-8408-aea659020067","Type":"ContainerStarted","Data":"9ec47e249ccb0f793f455ce5424f43deea25a4a4a22a6b6ca14eed8bc9b51d6c"} Sep 29 10:04:22 crc kubenswrapper[4922]: I0929 10:04:22.859522 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r6vhs"] Sep 29 10:04:23 crc kubenswrapper[4922]: I0929 10:04:23.470036 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fcbf4cf-3736-41e9-9f6f-75e60d3753a5" path="/var/lib/kubelet/pods/9fcbf4cf-3736-41e9-9f6f-75e60d3753a5/volumes" Sep 29 10:04:23 crc kubenswrapper[4922]: I0929 10:04:23.825646 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r6vhs" event={"ID":"9964cee5-67a1-4a42-84e3-3586ed6c3457","Type":"ContainerStarted","Data":"bd77e9f0fab3b7e75cd86e2e597efc8afbfdf6fdf0dbda2d42afdf414dd6bb5d"} Sep 29 10:04:23 crc kubenswrapper[4922]: I0929 10:04:23.825708 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r6vhs" event={"ID":"9964cee5-67a1-4a42-84e3-3586ed6c3457","Type":"ContainerStarted","Data":"de6633686ba8f5fcc2110483f80551ddf110cc7b15e26b0c6b878e9e21bc08f4"} Sep 29 10:04:23 crc kubenswrapper[4922]: I0929 10:04:23.828771 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d0d4b4b-9b83-43ab-acbc-656f07072dc6","Type":"ContainerStarted","Data":"4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303"} Sep 29 10:04:23 crc kubenswrapper[4922]: I0929 10:04:23.828800 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"7d0d4b4b-9b83-43ab-acbc-656f07072dc6","Type":"ContainerStarted","Data":"0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d"} Sep 29 10:04:23 crc kubenswrapper[4922]: I0929 10:04:23.828812 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d0d4b4b-9b83-43ab-acbc-656f07072dc6","Type":"ContainerStarted","Data":"6d108de9ecf920445b3f497152413621272a0836f1cdbd7b6b553945dd5f5cbd"} Sep 29 10:04:23 crc kubenswrapper[4922]: I0929 10:04:23.854968 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-r6vhs" podStartSLOduration=2.854942468 podStartE2EDuration="2.854942468s" podCreationTimestamp="2025-09-29 10:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:23.845951069 +0000 UTC m=+1189.212181333" watchObservedRunningTime="2025-09-29 10:04:23.854942468 +0000 UTC m=+1189.221172732" Sep 29 10:04:23 crc kubenswrapper[4922]: I0929 10:04:23.893293 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.893264675 podStartE2EDuration="2.893264675s" podCreationTimestamp="2025-09-29 10:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:23.872277308 +0000 UTC m=+1189.238507582" watchObservedRunningTime="2025-09-29 10:04:23.893264675 +0000 UTC m=+1189.259494939" Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.286018 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.371781 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-s9zj6"] Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 
10:04:24.372093 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" podUID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" containerName="dnsmasq-dns" containerID="cri-o://e14fdb608d43bf1cb8d93b4c4b614ebc31df24a27e99b910ec49ce0d1252a32f" gracePeriod=10 Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.843760 4922 generic.go:334] "Generic (PLEG): container finished" podID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" containerID="e14fdb608d43bf1cb8d93b4c4b614ebc31df24a27e99b910ec49ce0d1252a32f" exitCode=0 Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.844345 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" event={"ID":"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9","Type":"ContainerDied","Data":"e14fdb608d43bf1cb8d93b4c4b614ebc31df24a27e99b910ec49ce0d1252a32f"} Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.856594 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ede8dfdb-116f-4e02-8408-aea659020067","Type":"ContainerStarted","Data":"eea772cf5bb52bbc0405f2d016b8da9fe5a992c3d8b74073993e739441003baa"} Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.857682 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.885189 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.982581263 podStartE2EDuration="6.885165058s" podCreationTimestamp="2025-09-29 10:04:18 +0000 UTC" firstStartedPulling="2025-09-29 10:04:19.680629832 +0000 UTC m=+1185.046860106" lastFinishedPulling="2025-09-29 10:04:23.583213637 +0000 UTC m=+1188.949443901" observedRunningTime="2025-09-29 10:04:24.879364194 +0000 UTC m=+1190.245594458" watchObservedRunningTime="2025-09-29 10:04:24.885165058 +0000 UTC m=+1190.251395322" Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 
10:04:24.949723 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.974258 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-sb\") pod \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.974399 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-svc\") pod \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.974534 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czxgk\" (UniqueName: \"kubernetes.io/projected/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-kube-api-access-czxgk\") pod \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.974555 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-swift-storage-0\") pod \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.974598 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-config\") pod \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " Sep 29 10:04:24 crc kubenswrapper[4922]: I0929 10:04:24.974678 4922 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-nb\") pod \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\" (UID: \"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9\") " Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.012293 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-kube-api-access-czxgk" (OuterVolumeSpecName: "kube-api-access-czxgk") pod "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" (UID: "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9"). InnerVolumeSpecName "kube-api-access-czxgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.077692 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czxgk\" (UniqueName: \"kubernetes.io/projected/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-kube-api-access-czxgk\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.088043 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" (UID: "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.093414 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" (UID: "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.105433 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" (UID: "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.122429 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-config" (OuterVolumeSpecName: "config") pod "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" (UID: "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.137300 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" (UID: "f0feb287-0f0c-4179-8f20-3a7d0ee00bd9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.178848 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.178890 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.178900 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.178910 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.178920 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.869453 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.870570 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" event={"ID":"f0feb287-0f0c-4179-8f20-3a7d0ee00bd9","Type":"ContainerDied","Data":"34d5a66ecd4a5252e847a91eab082ab696a0d5bc2669332b4e720df0df67ce3b"} Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.870632 4922 scope.go:117] "RemoveContainer" containerID="e14fdb608d43bf1cb8d93b4c4b614ebc31df24a27e99b910ec49ce0d1252a32f" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.902043 4922 scope.go:117] "RemoveContainer" containerID="b2404176b3c9df44dc034538dbe0f5b56947169132271ac6662b419cf382050b" Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.902204 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-s9zj6"] Sep 29 10:04:25 crc kubenswrapper[4922]: I0929 10:04:25.913607 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-s9zj6"] Sep 29 10:04:27 crc kubenswrapper[4922]: I0929 10:04:27.465250 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" path="/var/lib/kubelet/pods/f0feb287-0f0c-4179-8f20-3a7d0ee00bd9/volumes" Sep 29 10:04:29 crc kubenswrapper[4922]: I0929 10:04:29.817251 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-845d6d6f59-s9zj6" podUID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.194:5353: i/o timeout" Sep 29 10:04:29 crc kubenswrapper[4922]: I0929 10:04:29.924484 4922 generic.go:334] "Generic (PLEG): container finished" podID="9964cee5-67a1-4a42-84e3-3586ed6c3457" containerID="bd77e9f0fab3b7e75cd86e2e597efc8afbfdf6fdf0dbda2d42afdf414dd6bb5d" exitCode=0 Sep 29 10:04:29 crc kubenswrapper[4922]: I0929 10:04:29.924554 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-cell-mapping-r6vhs" event={"ID":"9964cee5-67a1-4a42-84e3-3586ed6c3457","Type":"ContainerDied","Data":"bd77e9f0fab3b7e75cd86e2e597efc8afbfdf6fdf0dbda2d42afdf414dd6bb5d"} Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.417823 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.520830 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-config-data\") pod \"9964cee5-67a1-4a42-84e3-3586ed6c3457\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.520997 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-combined-ca-bundle\") pod \"9964cee5-67a1-4a42-84e3-3586ed6c3457\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.521055 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-scripts\") pod \"9964cee5-67a1-4a42-84e3-3586ed6c3457\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.521423 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdttp\" (UniqueName: \"kubernetes.io/projected/9964cee5-67a1-4a42-84e3-3586ed6c3457-kube-api-access-zdttp\") pod \"9964cee5-67a1-4a42-84e3-3586ed6c3457\" (UID: \"9964cee5-67a1-4a42-84e3-3586ed6c3457\") " Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.528685 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-scripts" (OuterVolumeSpecName: "scripts") pod "9964cee5-67a1-4a42-84e3-3586ed6c3457" (UID: "9964cee5-67a1-4a42-84e3-3586ed6c3457"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.529040 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9964cee5-67a1-4a42-84e3-3586ed6c3457-kube-api-access-zdttp" (OuterVolumeSpecName: "kube-api-access-zdttp") pod "9964cee5-67a1-4a42-84e3-3586ed6c3457" (UID: "9964cee5-67a1-4a42-84e3-3586ed6c3457"). InnerVolumeSpecName "kube-api-access-zdttp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.553516 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-config-data" (OuterVolumeSpecName: "config-data") pod "9964cee5-67a1-4a42-84e3-3586ed6c3457" (UID: "9964cee5-67a1-4a42-84e3-3586ed6c3457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.557283 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9964cee5-67a1-4a42-84e3-3586ed6c3457" (UID: "9964cee5-67a1-4a42-84e3-3586ed6c3457"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.625268 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.625334 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.625354 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9964cee5-67a1-4a42-84e3-3586ed6c3457-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.625393 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdttp\" (UniqueName: \"kubernetes.io/projected/9964cee5-67a1-4a42-84e3-3586ed6c3457-kube-api-access-zdttp\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.973453 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r6vhs" event={"ID":"9964cee5-67a1-4a42-84e3-3586ed6c3457","Type":"ContainerDied","Data":"de6633686ba8f5fcc2110483f80551ddf110cc7b15e26b0c6b878e9e21bc08f4"} Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.973516 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r6vhs" Sep 29 10:04:31 crc kubenswrapper[4922]: I0929 10:04:31.973529 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de6633686ba8f5fcc2110483f80551ddf110cc7b15e26b0c6b878e9e21bc08f4" Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.159165 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.159519 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerName="nova-api-log" containerID="cri-o://0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d" gracePeriod=30 Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.159615 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerName="nova-api-api" containerID="cri-o://4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303" gracePeriod=30 Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.178530 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.178889 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" containerName="nova-scheduler-scheduler" containerID="cri-o://07187f390913ac5f0ec7e7978704ae237933afa2f1288d15d11c60f79d163c38" gracePeriod=30 Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.321013 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.321638 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" 
containerName="nova-metadata-log" containerID="cri-o://ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977" gracePeriod=30 Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.321638 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-metadata" containerID="cri-o://e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739" gracePeriod=30 Sep 29 10:04:32 crc kubenswrapper[4922]: E0929 10:04:32.632820 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07187f390913ac5f0ec7e7978704ae237933afa2f1288d15d11c60f79d163c38" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:04:32 crc kubenswrapper[4922]: E0929 10:04:32.641376 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07187f390913ac5f0ec7e7978704ae237933afa2f1288d15d11c60f79d163c38" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:04:32 crc kubenswrapper[4922]: E0929 10:04:32.643129 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07187f390913ac5f0ec7e7978704ae237933afa2f1288d15d11c60f79d163c38" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 10:04:32 crc kubenswrapper[4922]: E0929 10:04:32.643247 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" containerName="nova-scheduler-scheduler" Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.760700 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.961455 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-config-data\") pod \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.962079 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-logs\") pod \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.962224 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th4tz\" (UniqueName: \"kubernetes.io/projected/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-kube-api-access-th4tz\") pod \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.962392 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-combined-ca-bundle\") pod \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.962530 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-public-tls-certs\") pod \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\" (UID: 
\"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.962745 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-internal-tls-certs\") pod \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\" (UID: \"7d0d4b4b-9b83-43ab-acbc-656f07072dc6\") " Sep 29 10:04:32 crc kubenswrapper[4922]: I0929 10:04:32.965619 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-logs" (OuterVolumeSpecName: "logs") pod "7d0d4b4b-9b83-43ab-acbc-656f07072dc6" (UID: "7d0d4b4b-9b83-43ab-acbc-656f07072dc6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.022447 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-kube-api-access-th4tz" (OuterVolumeSpecName: "kube-api-access-th4tz") pod "7d0d4b4b-9b83-43ab-acbc-656f07072dc6" (UID: "7d0d4b4b-9b83-43ab-acbc-656f07072dc6"). InnerVolumeSpecName "kube-api-access-th4tz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.032337 4922 generic.go:334] "Generic (PLEG): container finished" podID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerID="ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977" exitCode=143 Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.032453 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1ff74e4-dbc7-42b5-9f8c-07812498f738","Type":"ContainerDied","Data":"ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977"} Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.043730 4922 generic.go:334] "Generic (PLEG): container finished" podID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerID="4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303" exitCode=0 Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.044275 4922 generic.go:334] "Generic (PLEG): container finished" podID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerID="0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d" exitCode=143 Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.043982 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d0d4b4b-9b83-43ab-acbc-656f07072dc6","Type":"ContainerDied","Data":"4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303"} Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.044338 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d0d4b4b-9b83-43ab-acbc-656f07072dc6","Type":"ContainerDied","Data":"0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d"} Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.044361 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7d0d4b4b-9b83-43ab-acbc-656f07072dc6","Type":"ContainerDied","Data":"6d108de9ecf920445b3f497152413621272a0836f1cdbd7b6b553945dd5f5cbd"} Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.044382 4922 scope.go:117] "RemoveContainer" containerID="4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.043930 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.066744 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-logs\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.066791 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th4tz\" (UniqueName: \"kubernetes.io/projected/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-kube-api-access-th4tz\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.067501 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d0d4b4b-9b83-43ab-acbc-656f07072dc6" (UID: "7d0d4b4b-9b83-43ab-acbc-656f07072dc6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.087723 4922 scope.go:117] "RemoveContainer" containerID="0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.093727 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7d0d4b4b-9b83-43ab-acbc-656f07072dc6" (UID: "7d0d4b4b-9b83-43ab-acbc-656f07072dc6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.137480 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d0d4b4b-9b83-43ab-acbc-656f07072dc6" (UID: "7d0d4b4b-9b83-43ab-acbc-656f07072dc6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.142985 4922 scope.go:117] "RemoveContainer" containerID="4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.145122 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-config-data" (OuterVolumeSpecName: "config-data") pod "7d0d4b4b-9b83-43ab-acbc-656f07072dc6" (UID: "7d0d4b4b-9b83-43ab-acbc-656f07072dc6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:33 crc kubenswrapper[4922]: E0929 10:04:33.146993 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303\": container with ID starting with 4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303 not found: ID does not exist" containerID="4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.147043 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303"} err="failed to get container status \"4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303\": rpc error: code = NotFound desc = could not find container \"4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303\": container with ID starting with 4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303 not found: ID does not exist" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.147073 4922 scope.go:117] "RemoveContainer" containerID="0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d" Sep 29 10:04:33 crc kubenswrapper[4922]: E0929 10:04:33.148671 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d\": container with ID starting with 0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d not found: ID does not exist" containerID="0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.148695 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d"} err="failed 
to get container status \"0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d\": rpc error: code = NotFound desc = could not find container \"0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d\": container with ID starting with 0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d not found: ID does not exist" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.148709 4922 scope.go:117] "RemoveContainer" containerID="4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.153277 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303"} err="failed to get container status \"4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303\": rpc error: code = NotFound desc = could not find container \"4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303\": container with ID starting with 4f7527a23ff705080c3467c2fcaca81bd681a17d218b4bace0b4abe7358f5303 not found: ID does not exist" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.153311 4922 scope.go:117] "RemoveContainer" containerID="0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.154332 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d"} err="failed to get container status \"0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d\": rpc error: code = NotFound desc = could not find container \"0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d\": container with ID starting with 0a14c9be43209100eadb84a95211528ea5908895181b05fff5f0672573700d2d not found: ID does not exist" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.182641 4922 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.182689 4922 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.182702 4922 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.182713 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d0d4b4b-9b83-43ab-acbc-656f07072dc6-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.387734 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.397514 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.417620 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:33 crc kubenswrapper[4922]: E0929 10:04:33.418162 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" containerName="dnsmasq-dns" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.418184 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" containerName="dnsmasq-dns" Sep 29 10:04:33 crc kubenswrapper[4922]: E0929 10:04:33.418212 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" containerName="init" Sep 29 
10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.418221 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" containerName="init" Sep 29 10:04:33 crc kubenswrapper[4922]: E0929 10:04:33.418233 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerName="nova-api-api" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.418238 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerName="nova-api-api" Sep 29 10:04:33 crc kubenswrapper[4922]: E0929 10:04:33.418253 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerName="nova-api-log" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.418259 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerName="nova-api-log" Sep 29 10:04:33 crc kubenswrapper[4922]: E0929 10:04:33.418269 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9964cee5-67a1-4a42-84e3-3586ed6c3457" containerName="nova-manage" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.418275 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9964cee5-67a1-4a42-84e3-3586ed6c3457" containerName="nova-manage" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.418502 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerName="nova-api-log" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.418527 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0feb287-0f0c-4179-8f20-3a7d0ee00bd9" containerName="dnsmasq-dns" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.418538 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9964cee5-67a1-4a42-84e3-3586ed6c3457" containerName="nova-manage" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 
10:04:33.418552 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" containerName="nova-api-api" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.420124 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.423765 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.423863 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.425080 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.465689 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d0d4b4b-9b83-43ab-acbc-656f07072dc6" path="/var/lib/kubelet/pods/7d0d4b4b-9b83-43ab-acbc-656f07072dc6/volumes" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.466586 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.490572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gnsp\" (UniqueName: \"kubernetes.io/projected/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-kube-api-access-9gnsp\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.490684 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-config-data\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc 
kubenswrapper[4922]: I0929 10:04:33.490801 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-logs\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.490874 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-public-tls-certs\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.490913 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.490946 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.593890 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gnsp\" (UniqueName: \"kubernetes.io/projected/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-kube-api-access-9gnsp\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.593984 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-config-data\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.594056 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-logs\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.594086 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-public-tls-certs\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.594123 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.594152 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.595154 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-logs\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.600526 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.601695 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-public-tls-certs\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.604913 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-config-data\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.613148 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.616062 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gnsp\" (UniqueName: \"kubernetes.io/projected/18e1ff02-c0b1-4095-a40d-b9dc5a492de4-kube-api-access-9gnsp\") pod \"nova-api-0\" (UID: \"18e1ff02-c0b1-4095-a40d-b9dc5a492de4\") " pod="openstack/nova-api-0" Sep 29 10:04:33 crc kubenswrapper[4922]: I0929 10:04:33.803618 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 10:04:34 crc kubenswrapper[4922]: I0929 10:04:34.280948 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 10:04:35 crc kubenswrapper[4922]: I0929 10:04:35.075404 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"18e1ff02-c0b1-4095-a40d-b9dc5a492de4","Type":"ContainerStarted","Data":"6ce27822b64365a98608170061eb00896d49e27f00313bb31c2d1651f8759bce"} Sep 29 10:04:35 crc kubenswrapper[4922]: I0929 10:04:35.075462 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"18e1ff02-c0b1-4095-a40d-b9dc5a492de4","Type":"ContainerStarted","Data":"e43d8726bb8ca9a66d8a37ffd8ba883f347dc95b2676d2afcb63bbde8eb4664f"} Sep 29 10:04:35 crc kubenswrapper[4922]: I0929 10:04:35.075482 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"18e1ff02-c0b1-4095-a40d-b9dc5a492de4","Type":"ContainerStarted","Data":"d66376490eddfe5b9e35c5ede1c9129a2dff2a8d917be1f458efff699d29dd24"} Sep 29 10:04:35 crc kubenswrapper[4922]: I0929 10:04:35.112592 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.112558858 podStartE2EDuration="2.112558858s" podCreationTimestamp="2025-09-29 10:04:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:35.096632445 +0000 UTC m=+1200.462862739" watchObservedRunningTime="2025-09-29 10:04:35.112558858 +0000 UTC m=+1200.478789132" Sep 29 10:04:35 crc kubenswrapper[4922]: I0929 10:04:35.496155 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:49632->10.217.0.197:8775: read: 
connection reset by peer" Sep 29 10:04:35 crc kubenswrapper[4922]: I0929 10:04:35.496276 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:49630->10.217.0.197:8775: read: connection reset by peer" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.009253 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.050296 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-nova-metadata-tls-certs\") pod \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.050489 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-config-data\") pod \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.050534 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s5mk\" (UniqueName: \"kubernetes.io/projected/f1ff74e4-dbc7-42b5-9f8c-07812498f738-kube-api-access-5s5mk\") pod \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.050643 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-combined-ca-bundle\") pod \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\" (UID: 
\"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.050721 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1ff74e4-dbc7-42b5-9f8c-07812498f738-logs\") pod \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\" (UID: \"f1ff74e4-dbc7-42b5-9f8c-07812498f738\") " Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.051752 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ff74e4-dbc7-42b5-9f8c-07812498f738-logs" (OuterVolumeSpecName: "logs") pod "f1ff74e4-dbc7-42b5-9f8c-07812498f738" (UID: "f1ff74e4-dbc7-42b5-9f8c-07812498f738"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.069131 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ff74e4-dbc7-42b5-9f8c-07812498f738-kube-api-access-5s5mk" (OuterVolumeSpecName: "kube-api-access-5s5mk") pod "f1ff74e4-dbc7-42b5-9f8c-07812498f738" (UID: "f1ff74e4-dbc7-42b5-9f8c-07812498f738"). InnerVolumeSpecName "kube-api-access-5s5mk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.096043 4922 generic.go:334] "Generic (PLEG): container finished" podID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerID="e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739" exitCode=0 Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.097020 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1ff74e4-dbc7-42b5-9f8c-07812498f738","Type":"ContainerDied","Data":"e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739"} Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.097110 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1ff74e4-dbc7-42b5-9f8c-07812498f738","Type":"ContainerDied","Data":"6ec632710354d3f25d1faabbc6578636c73a81b18fed5592b0c6568e86ba56d6"} Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.097105 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.097138 4922 scope.go:117] "RemoveContainer" containerID="e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.117740 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1ff74e4-dbc7-42b5-9f8c-07812498f738" (UID: "f1ff74e4-dbc7-42b5-9f8c-07812498f738"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.121615 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-config-data" (OuterVolumeSpecName: "config-data") pod "f1ff74e4-dbc7-42b5-9f8c-07812498f738" (UID: "f1ff74e4-dbc7-42b5-9f8c-07812498f738"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.142484 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f1ff74e4-dbc7-42b5-9f8c-07812498f738" (UID: "f1ff74e4-dbc7-42b5-9f8c-07812498f738"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.153972 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.154018 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s5mk\" (UniqueName: \"kubernetes.io/projected/f1ff74e4-dbc7-42b5-9f8c-07812498f738-kube-api-access-5s5mk\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.154037 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.154050 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1ff74e4-dbc7-42b5-9f8c-07812498f738-logs\") on node \"crc\" DevicePath 
\"\"" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.154062 4922 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1ff74e4-dbc7-42b5-9f8c-07812498f738-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.205350 4922 scope.go:117] "RemoveContainer" containerID="ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.232467 4922 scope.go:117] "RemoveContainer" containerID="e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739" Sep 29 10:04:36 crc kubenswrapper[4922]: E0929 10:04:36.232879 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739\": container with ID starting with e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739 not found: ID does not exist" containerID="e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.232931 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739"} err="failed to get container status \"e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739\": rpc error: code = NotFound desc = could not find container \"e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739\": container with ID starting with e0129207a2658cd6feb45b92b7421565311de2aec16baac7d9e207f209d93739 not found: ID does not exist" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.232970 4922 scope.go:117] "RemoveContainer" containerID="ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977" Sep 29 10:04:36 crc kubenswrapper[4922]: E0929 10:04:36.233284 4922 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977\": container with ID starting with ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977 not found: ID does not exist" containerID="ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.233364 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977"} err="failed to get container status \"ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977\": rpc error: code = NotFound desc = could not find container \"ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977\": container with ID starting with ee76d08e9ebcf1ec2e0bf95e79704e2c5d6c6403788a1cd82445f95a866b8977 not found: ID does not exist" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.451102 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.467823 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.523097 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:04:36 crc kubenswrapper[4922]: E0929 10:04:36.541038 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-log" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.541098 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-log" Sep 29 10:04:36 crc kubenswrapper[4922]: E0929 10:04:36.541117 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" 
containerName="nova-metadata-metadata" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.541125 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-metadata" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.541958 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-log" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.542006 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" containerName="nova-metadata-metadata" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.543440 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.549480 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.554012 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.554947 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.563678 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e84e53c-d007-4780-be8e-1794d0c7b88f-config-data\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.563843 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e84e53c-d007-4780-be8e-1794d0c7b88f-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.563885 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e84e53c-d007-4780-be8e-1794d0c7b88f-logs\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.563916 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e84e53c-d007-4780-be8e-1794d0c7b88f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.563964 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbwr7\" (UniqueName: \"kubernetes.io/projected/6e84e53c-d007-4780-be8e-1794d0c7b88f-kube-api-access-mbwr7\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.665635 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbwr7\" (UniqueName: \"kubernetes.io/projected/6e84e53c-d007-4780-be8e-1794d0c7b88f-kube-api-access-mbwr7\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.665725 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e84e53c-d007-4780-be8e-1794d0c7b88f-config-data\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc 
kubenswrapper[4922]: I0929 10:04:36.665866 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e84e53c-d007-4780-be8e-1794d0c7b88f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.665917 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e84e53c-d007-4780-be8e-1794d0c7b88f-logs\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.665953 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e84e53c-d007-4780-be8e-1794d0c7b88f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.667096 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e84e53c-d007-4780-be8e-1794d0c7b88f-logs\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.672699 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e84e53c-d007-4780-be8e-1794d0c7b88f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.672960 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e84e53c-d007-4780-be8e-1794d0c7b88f-config-data\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.688977 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e84e53c-d007-4780-be8e-1794d0c7b88f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.689497 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbwr7\" (UniqueName: \"kubernetes.io/projected/6e84e53c-d007-4780-be8e-1794d0c7b88f-kube-api-access-mbwr7\") pod \"nova-metadata-0\" (UID: \"6e84e53c-d007-4780-be8e-1794d0c7b88f\") " pod="openstack/nova-metadata-0" Sep 29 10:04:36 crc kubenswrapper[4922]: I0929 10:04:36.919119 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.113915 4922 generic.go:334] "Generic (PLEG): container finished" podID="f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" containerID="07187f390913ac5f0ec7e7978704ae237933afa2f1288d15d11c60f79d163c38" exitCode=0 Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.114076 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d","Type":"ContainerDied","Data":"07187f390913ac5f0ec7e7978704ae237933afa2f1288d15d11c60f79d163c38"} Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.278310 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.381443 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-config-data\") pod \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.381538 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-kube-api-access-tnmdr\") pod \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.381578 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-combined-ca-bundle\") pod \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\" (UID: \"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d\") " Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.386993 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-kube-api-access-tnmdr" (OuterVolumeSpecName: "kube-api-access-tnmdr") pod "f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" (UID: "f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d"). InnerVolumeSpecName "kube-api-access-tnmdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.410495 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-config-data" (OuterVolumeSpecName: "config-data") pod "f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" (UID: "f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.418934 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" (UID: "f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.474967 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ff74e4-dbc7-42b5-9f8c-07812498f738" path="/var/lib/kubelet/pods/f1ff74e4-dbc7-42b5-9f8c-07812498f738/volumes" Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.485426 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.485487 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-kube-api-access-tnmdr\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.485506 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:04:37 crc kubenswrapper[4922]: I0929 10:04:37.515690 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 10:04:37 crc kubenswrapper[4922]: W0929 10:04:37.526598 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e84e53c_d007_4780_be8e_1794d0c7b88f.slice/crio-2a87609ba5a2bc5e226e35c701282cdf8188af6c74a433d6e5339c35b7fdedef WatchSource:0}: Error finding container 2a87609ba5a2bc5e226e35c701282cdf8188af6c74a433d6e5339c35b7fdedef: Status 404 returned error can't find the container with id 2a87609ba5a2bc5e226e35c701282cdf8188af6c74a433d6e5339c35b7fdedef Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.131648 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d","Type":"ContainerDied","Data":"8af545db079fafa43c42db2eecad91043af5d05959926d0d82353b64c3c3d455"} Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.131706 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.132157 4922 scope.go:117] "RemoveContainer" containerID="07187f390913ac5f0ec7e7978704ae237933afa2f1288d15d11c60f79d163c38" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.136154 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e84e53c-d007-4780-be8e-1794d0c7b88f","Type":"ContainerStarted","Data":"841a67f01625d62aa572406496bf7077cf5a6482fa14e4de185f6e140dac5ea9"} Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.136183 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e84e53c-d007-4780-be8e-1794d0c7b88f","Type":"ContainerStarted","Data":"54aeecacf27db330dfbf18f3a048e1d12ebfa3b7379f459d9bd095ad2b5f04ff"} Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.136192 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e84e53c-d007-4780-be8e-1794d0c7b88f","Type":"ContainerStarted","Data":"2a87609ba5a2bc5e226e35c701282cdf8188af6c74a433d6e5339c35b7fdedef"} Sep 29 10:04:38 crc 
kubenswrapper[4922]: I0929 10:04:38.169305 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.169279945 podStartE2EDuration="2.169279945s" podCreationTimestamp="2025-09-29 10:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:38.166913643 +0000 UTC m=+1203.533143937" watchObservedRunningTime="2025-09-29 10:04:38.169279945 +0000 UTC m=+1203.535510209" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.219049 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.229787 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.241301 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:04:38 crc kubenswrapper[4922]: E0929 10:04:38.244000 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" containerName="nova-scheduler-scheduler" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.244052 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" containerName="nova-scheduler-scheduler" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.244328 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" containerName="nova-scheduler-scheduler" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.245840 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.249623 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.265500 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.304618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee836f5f-3a1b-4c14-9234-711246af0b41-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee836f5f-3a1b-4c14-9234-711246af0b41\") " pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.304686 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee836f5f-3a1b-4c14-9234-711246af0b41-config-data\") pod \"nova-scheduler-0\" (UID: \"ee836f5f-3a1b-4c14-9234-711246af0b41\") " pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.304762 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5vcb\" (UniqueName: \"kubernetes.io/projected/ee836f5f-3a1b-4c14-9234-711246af0b41-kube-api-access-p5vcb\") pod \"nova-scheduler-0\" (UID: \"ee836f5f-3a1b-4c14-9234-711246af0b41\") " pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.409868 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee836f5f-3a1b-4c14-9234-711246af0b41-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee836f5f-3a1b-4c14-9234-711246af0b41\") " pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.409948 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee836f5f-3a1b-4c14-9234-711246af0b41-config-data\") pod \"nova-scheduler-0\" (UID: \"ee836f5f-3a1b-4c14-9234-711246af0b41\") " pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.410039 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5vcb\" (UniqueName: \"kubernetes.io/projected/ee836f5f-3a1b-4c14-9234-711246af0b41-kube-api-access-p5vcb\") pod \"nova-scheduler-0\" (UID: \"ee836f5f-3a1b-4c14-9234-711246af0b41\") " pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.415511 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee836f5f-3a1b-4c14-9234-711246af0b41-config-data\") pod \"nova-scheduler-0\" (UID: \"ee836f5f-3a1b-4c14-9234-711246af0b41\") " pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.415936 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee836f5f-3a1b-4c14-9234-711246af0b41-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ee836f5f-3a1b-4c14-9234-711246af0b41\") " pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.433635 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5vcb\" (UniqueName: \"kubernetes.io/projected/ee836f5f-3a1b-4c14-9234-711246af0b41-kube-api-access-p5vcb\") pod \"nova-scheduler-0\" (UID: \"ee836f5f-3a1b-4c14-9234-711246af0b41\") " pod="openstack/nova-scheduler-0" Sep 29 10:04:38 crc kubenswrapper[4922]: I0929 10:04:38.582737 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 10:04:39 crc kubenswrapper[4922]: I0929 10:04:39.052780 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 10:04:39 crc kubenswrapper[4922]: I0929 10:04:39.153200 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee836f5f-3a1b-4c14-9234-711246af0b41","Type":"ContainerStarted","Data":"f58c70f27df3a6631836e199eb1ba4a5dedec1c044d90da0713acea7ea7a2035"} Sep 29 10:04:39 crc kubenswrapper[4922]: I0929 10:04:39.470772 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d" path="/var/lib/kubelet/pods/f8a9f1bb-bbef-4361-99e6-e9fc027b6f2d/volumes" Sep 29 10:04:40 crc kubenswrapper[4922]: I0929 10:04:40.173791 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ee836f5f-3a1b-4c14-9234-711246af0b41","Type":"ContainerStarted","Data":"eec4b1e87ddc439c22d978e056821b22d4387e27242f37d97228b57f1928642a"} Sep 29 10:04:40 crc kubenswrapper[4922]: I0929 10:04:40.210540 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.210504585 podStartE2EDuration="2.210504585s" podCreationTimestamp="2025-09-29 10:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:04:40.196820801 +0000 UTC m=+1205.563051085" watchObservedRunningTime="2025-09-29 10:04:40.210504585 +0000 UTC m=+1205.576734869" Sep 29 10:04:41 crc kubenswrapper[4922]: I0929 10:04:41.919967 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 10:04:41 crc kubenswrapper[4922]: I0929 10:04:41.920442 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 10:04:43 crc kubenswrapper[4922]: I0929 
10:04:43.583189 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 29 10:04:43 crc kubenswrapper[4922]: I0929 10:04:43.804701 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:04:43 crc kubenswrapper[4922]: I0929 10:04:43.804892 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 10:04:44 crc kubenswrapper[4922]: I0929 10:04:44.823168 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="18e1ff02-c0b1-4095-a40d-b9dc5a492de4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:04:44 crc kubenswrapper[4922]: I0929 10:04:44.823285 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="18e1ff02-c0b1-4095-a40d-b9dc5a492de4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 10:04:46 crc kubenswrapper[4922]: I0929 10:04:46.920221 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 10:04:46 crc kubenswrapper[4922]: I0929 10:04:46.920892 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 10:04:47 crc kubenswrapper[4922]: I0929 10:04:47.935047 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6e84e53c-d007-4780-be8e-1794d0c7b88f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:04:47 crc kubenswrapper[4922]: I0929 10:04:47.935339 4922 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="6e84e53c-d007-4780-be8e-1794d0c7b88f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 10:04:48 crc kubenswrapper[4922]: I0929 10:04:48.583721 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 29 10:04:48 crc kubenswrapper[4922]: I0929 10:04:48.648652 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 29 10:04:49 crc kubenswrapper[4922]: I0929 10:04:49.130754 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 29 10:04:49 crc kubenswrapper[4922]: I0929 10:04:49.315992 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 29 10:04:53 crc kubenswrapper[4922]: I0929 10:04:53.816238 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 10:04:53 crc kubenswrapper[4922]: I0929 10:04:53.817196 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 10:04:53 crc kubenswrapper[4922]: I0929 10:04:53.817572 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 10:04:53 crc kubenswrapper[4922]: I0929 10:04:53.817599 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 10:04:53 crc kubenswrapper[4922]: I0929 10:04:53.824493 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 10:04:53 crc kubenswrapper[4922]: I0929 10:04:53.824560 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 10:04:56 crc kubenswrapper[4922]: I0929 10:04:56.926743 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 10:04:56 crc kubenswrapper[4922]: I0929 10:04:56.937342 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 10:04:56 crc kubenswrapper[4922]: I0929 10:04:56.937660 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 10:04:57 crc kubenswrapper[4922]: I0929 10:04:57.411285 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 10:05:05 crc kubenswrapper[4922]: I0929 10:05:05.406688 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:05:06 crc kubenswrapper[4922]: I0929 10:05:06.431324 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:05:10 crc kubenswrapper[4922]: I0929 10:05:10.981620 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3a51d044-d162-4938-8ca4-b4a200e78739" containerName="rabbitmq" containerID="cri-o://7df93ad0c009d4519aa6464d01fcf6e1224050a5ded4664b07d5a04cd1aad245" gracePeriod=604795 Sep 29 10:05:11 crc kubenswrapper[4922]: I0929 10:05:11.504416 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" containerName="rabbitmq" containerID="cri-o://315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549" gracePeriod=604795 Sep 29 10:05:15 crc kubenswrapper[4922]: I0929 10:05:15.778070 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Sep 29 10:05:16 crc kubenswrapper[4922]: I0929 
10:05:16.144205 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3a51d044-d162-4938-8ca4-b4a200e78739" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.674234 4922 generic.go:334] "Generic (PLEG): container finished" podID="3a51d044-d162-4938-8ca4-b4a200e78739" containerID="7df93ad0c009d4519aa6464d01fcf6e1224050a5ded4664b07d5a04cd1aad245" exitCode=0 Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.674349 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a51d044-d162-4938-8ca4-b4a200e78739","Type":"ContainerDied","Data":"7df93ad0c009d4519aa6464d01fcf6e1224050a5ded4664b07d5a04cd1aad245"} Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.849691 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.982708 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-erlang-cookie\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.982843 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a51d044-d162-4938-8ca4-b4a200e78739-pod-info\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.983800 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-tls\") pod 
\"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.983898 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-config-data\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.983982 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-confd\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.984027 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-plugins-conf\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.984085 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-server-conf\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.984124 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lghhr\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-kube-api-access-lghhr\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.984199 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.984236 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a51d044-d162-4938-8ca4-b4a200e78739-erlang-cookie-secret\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.984334 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-plugins\") pod \"3a51d044-d162-4938-8ca4-b4a200e78739\" (UID: \"3a51d044-d162-4938-8ca4-b4a200e78739\") " Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.983689 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.985587 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.988541 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). 
InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.989215 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.992385 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:17 crc kubenswrapper[4922]: I0929 10:05:17.999871 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-kube-api-access-lghhr" (OuterVolumeSpecName: "kube-api-access-lghhr") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "kube-api-access-lghhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.000071 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3a51d044-d162-4938-8ca4-b4a200e78739-pod-info" (OuterVolumeSpecName: "pod-info") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.000275 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.011239 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a51d044-d162-4938-8ca4-b4a200e78739-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.049390 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-config-data" (OuterVolumeSpecName: "config-data") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.085725 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-server-conf" (OuterVolumeSpecName: "server-conf") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.088069 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.088112 4922 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.088125 4922 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a51d044-d162-4938-8ca4-b4a200e78739-server-conf\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.088137 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lghhr\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-kube-api-access-lghhr\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.088163 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.088175 4922 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a51d044-d162-4938-8ca4-b4a200e78739-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.088187 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.088197 4922 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a51d044-d162-4938-8ca4-b4a200e78739-pod-info\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.088206 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.115402 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.190324 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.224227 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3a51d044-d162-4938-8ca4-b4a200e78739" (UID: "3a51d044-d162-4938-8ca4-b4a200e78739"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.293086 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a51d044-d162-4938-8ca4-b4a200e78739-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.326198 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.496787 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-confd\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.496857 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.496899 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2ad8ac2-2191-43ab-9979-9ccbe368d883-pod-info\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.496954 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-plugins-conf\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.496997 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-tls\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.497042 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-erlang-cookie\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.497087 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-config-data\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.497135 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7ndd\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-kube-api-access-v7ndd\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.497181 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-server-conf\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.497216 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2ad8ac2-2191-43ab-9979-9ccbe368d883-erlang-cookie-secret\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.497236 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-plugins\") pod \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\" (UID: \"e2ad8ac2-2191-43ab-9979-9ccbe368d883\") " Sep 29 10:05:18 crc 
kubenswrapper[4922]: I0929 10:05:18.498033 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.498719 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.500258 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.510012 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.511859 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2ad8ac2-2191-43ab-9979-9ccbe368d883-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.512217 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.528754 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-kube-api-access-v7ndd" (OuterVolumeSpecName: "kube-api-access-v7ndd") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "kube-api-access-v7ndd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.530327 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e2ad8ac2-2191-43ab-9979-9ccbe368d883-pod-info" (OuterVolumeSpecName: "pod-info") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.553568 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-config-data" (OuterVolumeSpecName: "config-data") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.586457 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-server-conf" (OuterVolumeSpecName: "server-conf") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599860 4922 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2ad8ac2-2191-43ab-9979-9ccbe368d883-pod-info\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599897 4922 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599908 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599917 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 
29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599930 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599938 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7ndd\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-kube-api-access-v7ndd\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599947 4922 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2ad8ac2-2191-43ab-9979-9ccbe368d883-server-conf\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599957 4922 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2ad8ac2-2191-43ab-9979-9ccbe368d883-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599966 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.599987 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.623008 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.654873 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e2ad8ac2-2191-43ab-9979-9ccbe368d883" (UID: "e2ad8ac2-2191-43ab-9979-9ccbe368d883"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.686966 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a51d044-d162-4938-8ca4-b4a200e78739","Type":"ContainerDied","Data":"542853881ba52660c1569c79c55c4d7f667bc023ff9e6400c3e1c1aae94d373e"} Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.687066 4922 scope.go:117] "RemoveContainer" containerID="7df93ad0c009d4519aa6464d01fcf6e1224050a5ded4664b07d5a04cd1aad245" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.687396 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.693229 4922 generic.go:334] "Generic (PLEG): container finished" podID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" containerID="315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549" exitCode=0 Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.693278 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2ad8ac2-2191-43ab-9979-9ccbe368d883","Type":"ContainerDied","Data":"315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549"} Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.693313 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2ad8ac2-2191-43ab-9979-9ccbe368d883","Type":"ContainerDied","Data":"a158d2ab863a555aa9cbc884dea642a860a9d3ed49f2aea712cac2338e5101fd"} Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.693342 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.702073 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2ad8ac2-2191-43ab-9979-9ccbe368d883-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.702103 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.720133 4922 scope.go:117] "RemoveContainer" containerID="faadddd4d5d9c294d7d0d82cbdccb37186b92c157c6a6cbb4ca84753ab65f49a" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.752374 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.763376 4922 scope.go:117] "RemoveContainer" containerID="315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.764439 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.775850 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.785502 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.797686 4922 scope.go:117] "RemoveContainer" containerID="c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.800687 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:05:18 crc kubenswrapper[4922]: E0929 10:05:18.801266 4922 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" containerName="setup-container" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.801282 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" containerName="setup-container" Sep 29 10:05:18 crc kubenswrapper[4922]: E0929 10:05:18.801296 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a51d044-d162-4938-8ca4-b4a200e78739" containerName="setup-container" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.801302 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a51d044-d162-4938-8ca4-b4a200e78739" containerName="setup-container" Sep 29 10:05:18 crc kubenswrapper[4922]: E0929 10:05:18.801315 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" containerName="rabbitmq" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.801321 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" containerName="rabbitmq" Sep 29 10:05:18 crc kubenswrapper[4922]: E0929 10:05:18.801335 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a51d044-d162-4938-8ca4-b4a200e78739" containerName="rabbitmq" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.801341 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a51d044-d162-4938-8ca4-b4a200e78739" containerName="rabbitmq" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.801514 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a51d044-d162-4938-8ca4-b4a200e78739" containerName="rabbitmq" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.801528 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" containerName="rabbitmq" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.802697 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.805562 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-p72nv" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.813024 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.813098 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.813268 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.813302 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.813396 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.813647 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.813799 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.816609 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.817337 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.821207 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.825377 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.826311 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.826364 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.826513 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.826556 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.826934 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.827083 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s9fwb" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.869250 4922 scope.go:117] "RemoveContainer" containerID="315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549" Sep 29 10:05:18 crc kubenswrapper[4922]: E0929 10:05:18.871707 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549\": container with ID starting with 315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549 not found: ID does not exist" containerID="315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.871740 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549"} err="failed to get container status \"315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549\": rpc error: code = NotFound desc = could not find container \"315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549\": container with ID starting with 315afd9ae24faabfd3a3c82a37b8ae4142b01ca71ecbc5da151dce404c516549 not found: ID does not exist" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.871766 4922 scope.go:117] "RemoveContainer" containerID="c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0" Sep 29 10:05:18 crc kubenswrapper[4922]: E0929 10:05:18.872406 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0\": container with ID starting with c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0 not found: ID does not exist" containerID="c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.872436 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0"} err="failed to get container status \"c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0\": rpc error: code = NotFound desc = could not find container \"c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0\": container with ID 
starting with c962baffa065e3e1feda60217b4826ec6915d127f97466c9186d548667cd6cd0 not found: ID does not exist" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.916726 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.916804 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d2c1fe4-f762-40fa-8439-f74d3e234d30-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.916852 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.916883 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d2c1fe4-f762-40fa-8439-f74d3e234d30-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.916909 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vktmk\" (UniqueName: \"kubernetes.io/projected/5d2c1fe4-f762-40fa-8439-f74d3e234d30-kube-api-access-vktmk\") pod \"rabbitmq-server-0\" (UID: 
\"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.916929 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.916963 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d2c1fe4-f762-40fa-8439-f74d3e234d30-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.916994 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.917094 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d2c1fe4-f762-40fa-8439-f74d3e234d30-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.917120 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d2c1fe4-f762-40fa-8439-f74d3e234d30-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 
10:05:18 crc kubenswrapper[4922]: I0929 10:05:18.917231 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.019730 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82b45f77-ae02-47df-b1ab-5137f6e23089-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.019816 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d2c1fe4-f762-40fa-8439-f74d3e234d30-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.019885 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.019961 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d2c1fe4-f762-40fa-8439-f74d3e234d30-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.019995 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vktmk\" (UniqueName: \"kubernetes.io/projected/5d2c1fe4-f762-40fa-8439-f74d3e234d30-kube-api-access-vktmk\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020060 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020131 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82b45f77-ae02-47df-b1ab-5137f6e23089-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020195 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82b45f77-ae02-47df-b1ab-5137f6e23089-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020240 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d2c1fe4-f762-40fa-8439-f74d3e234d30-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020309 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020363 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020402 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d2c1fe4-f762-40fa-8439-f74d3e234d30-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020460 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d2c1fe4-f762-40fa-8439-f74d3e234d30-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020522 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx7rz\" (UniqueName: \"kubernetes.io/projected/82b45f77-ae02-47df-b1ab-5137f6e23089-kube-api-access-wx7rz\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020609 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020649 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b45f77-ae02-47df-b1ab-5137f6e23089-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020717 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020776 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020859 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82b45f77-ae02-47df-b1ab-5137f6e23089-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020889 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.020916 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d2c1fe4-f762-40fa-8439-f74d3e234d30-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.021364 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.021525 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.021580 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.021849 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.021956 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.022670 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d2c1fe4-f762-40fa-8439-f74d3e234d30-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.022786 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d2c1fe4-f762-40fa-8439-f74d3e234d30-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.026671 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.026671 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d2c1fe4-f762-40fa-8439-f74d3e234d30-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.027806 4922 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d2c1fe4-f762-40fa-8439-f74d3e234d30-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.028502 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d2c1fe4-f762-40fa-8439-f74d3e234d30-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.043183 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vktmk\" (UniqueName: \"kubernetes.io/projected/5d2c1fe4-f762-40fa-8439-f74d3e234d30-kube-api-access-vktmk\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.077801 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"5d2c1fe4-f762-40fa-8439-f74d3e234d30\") " pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123391 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b45f77-ae02-47df-b1ab-5137f6e23089-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123451 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123479 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82b45f77-ae02-47df-b1ab-5137f6e23089-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123510 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123530 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123557 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82b45f77-ae02-47df-b1ab-5137f6e23089-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123616 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82b45f77-ae02-47df-b1ab-5137f6e23089-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: 
I0929 10:05:19.123634 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82b45f77-ae02-47df-b1ab-5137f6e23089-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123662 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx7rz\" (UniqueName: \"kubernetes.io/projected/82b45f77-ae02-47df-b1ab-5137f6e23089-kube-api-access-wx7rz\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.123724 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.124178 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.124922 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.125375 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.126720 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82b45f77-ae02-47df-b1ab-5137f6e23089-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.126968 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b45f77-ae02-47df-b1ab-5137f6e23089-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.127116 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82b45f77-ae02-47df-b1ab-5137f6e23089-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.128049 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82b45f77-ae02-47df-b1ab-5137f6e23089-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.132436 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82b45f77-ae02-47df-b1ab-5137f6e23089-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.132730 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.141591 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.141750 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82b45f77-ae02-47df-b1ab-5137f6e23089-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.147975 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx7rz\" (UniqueName: \"kubernetes.io/projected/82b45f77-ae02-47df-b1ab-5137f6e23089-kube-api-access-wx7rz\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.164278 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"82b45f77-ae02-47df-b1ab-5137f6e23089\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.451520 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.487911 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a51d044-d162-4938-8ca4-b4a200e78739" path="/var/lib/kubelet/pods/3a51d044-d162-4938-8ca4-b4a200e78739/volumes" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.489154 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ad8ac2-2191-43ab-9979-9ccbe368d883" path="/var/lib/kubelet/pods/e2ad8ac2-2191-43ab-9979-9ccbe368d883/volumes" Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.607478 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.709296 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d2c1fe4-f762-40fa-8439-f74d3e234d30","Type":"ContainerStarted","Data":"a58728d803fb983344a06cb69c627d0b0bafdc4b514e443a8fe9e299c8ffdf7f"} Sep 29 10:05:19 crc kubenswrapper[4922]: I0929 10:05:19.936185 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 10:05:19 crc kubenswrapper[4922]: W0929 10:05:19.957194 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82b45f77_ae02_47df_b1ab_5137f6e23089.slice/crio-788437a7cdf98ce197a5fb045ad8bcfd82a65b013a8bacb3f809d10a1c28c362 WatchSource:0}: Error finding container 788437a7cdf98ce197a5fb045ad8bcfd82a65b013a8bacb3f809d10a1c28c362: Status 404 returned error can't find the container with id 
788437a7cdf98ce197a5fb045ad8bcfd82a65b013a8bacb3f809d10a1c28c362 Sep 29 10:05:20 crc kubenswrapper[4922]: I0929 10:05:20.739190 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"82b45f77-ae02-47df-b1ab-5137f6e23089","Type":"ContainerStarted","Data":"fe19fc1d7a063692d1f26e8b96a8a40d02eeb8ae0b1db947176d4f8037afb77b"} Sep 29 10:05:20 crc kubenswrapper[4922]: I0929 10:05:20.739852 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"82b45f77-ae02-47df-b1ab-5137f6e23089","Type":"ContainerStarted","Data":"788437a7cdf98ce197a5fb045ad8bcfd82a65b013a8bacb3f809d10a1c28c362"} Sep 29 10:05:20 crc kubenswrapper[4922]: I0929 10:05:20.747741 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d2c1fe4-f762-40fa-8439-f74d3e234d30","Type":"ContainerStarted","Data":"9f2f5f2ed9b4d13bb1f0bfcd8a2e1f05e83dfa63a176f9dcb6d3da6bc415547a"} Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.421286 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kwvpf"] Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.424176 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.428684 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.437439 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kwvpf"] Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.482027 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-svc\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.482072 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-config\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.482174 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.482257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " 
pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.482278 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.482294 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.482315 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlhnf\" (UniqueName: \"kubernetes.io/projected/321e5bc2-e26d-4f07-939b-a2346d51b576-kube-api-access-hlhnf\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.583905 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.583969 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: 
\"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.583988 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.584007 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlhnf\" (UniqueName: \"kubernetes.io/projected/321e5bc2-e26d-4f07-939b-a2346d51b576-kube-api-access-hlhnf\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.584050 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-config\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.584071 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-svc\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.584176 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " 
pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.585425 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.585438 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.585520 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.585601 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-svc\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.586271 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: 
I0929 10:05:21.586607 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-config\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.605993 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlhnf\" (UniqueName: \"kubernetes.io/projected/321e5bc2-e26d-4f07-939b-a2346d51b576-kube-api-access-hlhnf\") pod \"dnsmasq-dns-67b789f86c-kwvpf\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:21 crc kubenswrapper[4922]: I0929 10:05:21.751532 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:23 crc kubenswrapper[4922]: I0929 10:05:23.067197 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kwvpf"] Sep 29 10:05:23 crc kubenswrapper[4922]: I0929 10:05:23.789267 4922 generic.go:334] "Generic (PLEG): container finished" podID="321e5bc2-e26d-4f07-939b-a2346d51b576" containerID="fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584" exitCode=0 Sep 29 10:05:23 crc kubenswrapper[4922]: I0929 10:05:23.789411 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" event={"ID":"321e5bc2-e26d-4f07-939b-a2346d51b576","Type":"ContainerDied","Data":"fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584"} Sep 29 10:05:23 crc kubenswrapper[4922]: I0929 10:05:23.789765 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" event={"ID":"321e5bc2-e26d-4f07-939b-a2346d51b576","Type":"ContainerStarted","Data":"f18a57227b11a4f868600684e4bacb63a7127bc266949763073a2b077eaecdf7"} Sep 29 10:05:24 crc kubenswrapper[4922]: I0929 
10:05:24.803602 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" event={"ID":"321e5bc2-e26d-4f07-939b-a2346d51b576","Type":"ContainerStarted","Data":"26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2"} Sep 29 10:05:24 crc kubenswrapper[4922]: I0929 10:05:24.805006 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:24 crc kubenswrapper[4922]: I0929 10:05:24.852748 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" podStartSLOduration=3.852726413 podStartE2EDuration="3.852726413s" podCreationTimestamp="2025-09-29 10:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:24.834532782 +0000 UTC m=+1250.200763076" watchObservedRunningTime="2025-09-29 10:05:24.852726413 +0000 UTC m=+1250.218956677" Sep 29 10:05:29 crc kubenswrapper[4922]: I0929 10:05:29.070507 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:05:29 crc kubenswrapper[4922]: I0929 10:05:29.071079 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:05:31 crc kubenswrapper[4922]: I0929 10:05:31.754588 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:31 crc kubenswrapper[4922]: I0929 
10:05:31.841324 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-7qztw"] Sep 29 10:05:31 crc kubenswrapper[4922]: I0929 10:05:31.842718 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" podUID="4945ef36-899a-4e42-b95e-b5dfcca99783" containerName="dnsmasq-dns" containerID="cri-o://e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5" gracePeriod=10 Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.045642 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-6l2dp"] Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.048392 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.138796 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-6l2dp"] Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.156234 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.156294 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.156313 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.156361 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tmb5\" (UniqueName: \"kubernetes.io/projected/fd5275e6-c3d3-474d-962a-3cdafc893dfd-kube-api-access-4tmb5\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.156384 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.156436 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.156483 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-config\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.258718 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.258844 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.258872 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.258932 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tmb5\" (UniqueName: \"kubernetes.io/projected/fd5275e6-c3d3-474d-962a-3cdafc893dfd-kube-api-access-4tmb5\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.258962 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.259035 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.259095 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-config\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.260303 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-config\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.260371 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.261088 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.261401 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: 
\"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.261976 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.262278 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd5275e6-c3d3-474d-962a-3cdafc893dfd-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.301982 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tmb5\" (UniqueName: \"kubernetes.io/projected/fd5275e6-c3d3-474d-962a-3cdafc893dfd-kube-api-access-4tmb5\") pod \"dnsmasq-dns-cb6ffcf87-6l2dp\" (UID: \"fd5275e6-c3d3-474d-962a-3cdafc893dfd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.421146 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.511682 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.566522 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-swift-storage-0\") pod \"4945ef36-899a-4e42-b95e-b5dfcca99783\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.567184 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-nb\") pod \"4945ef36-899a-4e42-b95e-b5dfcca99783\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.567243 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-config\") pod \"4945ef36-899a-4e42-b95e-b5dfcca99783\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.567387 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4k2p\" (UniqueName: \"kubernetes.io/projected/4945ef36-899a-4e42-b95e-b5dfcca99783-kube-api-access-v4k2p\") pod \"4945ef36-899a-4e42-b95e-b5dfcca99783\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.567417 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-sb\") pod \"4945ef36-899a-4e42-b95e-b5dfcca99783\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.567454 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-svc\") pod \"4945ef36-899a-4e42-b95e-b5dfcca99783\" (UID: \"4945ef36-899a-4e42-b95e-b5dfcca99783\") " Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.574897 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4945ef36-899a-4e42-b95e-b5dfcca99783-kube-api-access-v4k2p" (OuterVolumeSpecName: "kube-api-access-v4k2p") pod "4945ef36-899a-4e42-b95e-b5dfcca99783" (UID: "4945ef36-899a-4e42-b95e-b5dfcca99783"). InnerVolumeSpecName "kube-api-access-v4k2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.659163 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4945ef36-899a-4e42-b95e-b5dfcca99783" (UID: "4945ef36-899a-4e42-b95e-b5dfcca99783"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.662846 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4945ef36-899a-4e42-b95e-b5dfcca99783" (UID: "4945ef36-899a-4e42-b95e-b5dfcca99783"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.667584 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4945ef36-899a-4e42-b95e-b5dfcca99783" (UID: "4945ef36-899a-4e42-b95e-b5dfcca99783"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.668240 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4945ef36-899a-4e42-b95e-b5dfcca99783" (UID: "4945ef36-899a-4e42-b95e-b5dfcca99783"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.671747 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4k2p\" (UniqueName: \"kubernetes.io/projected/4945ef36-899a-4e42-b95e-b5dfcca99783-kube-api-access-v4k2p\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.671785 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.671797 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.671807 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.671817 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.683765 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-config" (OuterVolumeSpecName: "config") pod "4945ef36-899a-4e42-b95e-b5dfcca99783" (UID: "4945ef36-899a-4e42-b95e-b5dfcca99783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.773891 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4945ef36-899a-4e42-b95e-b5dfcca99783-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.916704 4922 generic.go:334] "Generic (PLEG): container finished" podID="4945ef36-899a-4e42-b95e-b5dfcca99783" containerID="e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5" exitCode=0 Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.916754 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" event={"ID":"4945ef36-899a-4e42-b95e-b5dfcca99783","Type":"ContainerDied","Data":"e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5"} Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.916780 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.916788 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-7qztw" event={"ID":"4945ef36-899a-4e42-b95e-b5dfcca99783","Type":"ContainerDied","Data":"7ee9f747d30eeee34cfd0ead6c93d73d647d8428f1b59dfa46ecf27ca5ce3528"} Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.916811 4922 scope.go:117] "RemoveContainer" containerID="e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.960080 4922 scope.go:117] "RemoveContainer" containerID="ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79" Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.974117 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-7qztw"] Sep 29 10:05:32 crc kubenswrapper[4922]: W0929 10:05:32.982868 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5275e6_c3d3_474d_962a_3cdafc893dfd.slice/crio-1151796b4c044ff9846433a184d7cad73a6cac430a28df42d3875668d5bbb6ae WatchSource:0}: Error finding container 1151796b4c044ff9846433a184d7cad73a6cac430a28df42d3875668d5bbb6ae: Status 404 returned error can't find the container with id 1151796b4c044ff9846433a184d7cad73a6cac430a28df42d3875668d5bbb6ae Sep 29 10:05:32 crc kubenswrapper[4922]: I0929 10:05:32.992377 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-6l2dp"] Sep 29 10:05:33 crc kubenswrapper[4922]: I0929 10:05:33.004349 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-7qztw"] Sep 29 10:05:33 crc kubenswrapper[4922]: I0929 10:05:33.206203 4922 scope.go:117] "RemoveContainer" containerID="e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5" Sep 29 10:05:33 crc kubenswrapper[4922]: E0929 
10:05:33.206772 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5\": container with ID starting with e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5 not found: ID does not exist" containerID="e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5" Sep 29 10:05:33 crc kubenswrapper[4922]: I0929 10:05:33.206815 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5"} err="failed to get container status \"e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5\": rpc error: code = NotFound desc = could not find container \"e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5\": container with ID starting with e2765d9053e1170a18f87c826ba0624285cb52bb3e20e33f1d9f689170a392a5 not found: ID does not exist" Sep 29 10:05:33 crc kubenswrapper[4922]: I0929 10:05:33.206862 4922 scope.go:117] "RemoveContainer" containerID="ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79" Sep 29 10:05:33 crc kubenswrapper[4922]: E0929 10:05:33.207140 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79\": container with ID starting with ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79 not found: ID does not exist" containerID="ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79" Sep 29 10:05:33 crc kubenswrapper[4922]: I0929 10:05:33.207172 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79"} err="failed to get container status \"ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79\": rpc 
error: code = NotFound desc = could not find container \"ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79\": container with ID starting with ceac48f2f52f044a0c863fcb5a84df18e54fba2d7c515bf5da7eadab5920cc79 not found: ID does not exist" Sep 29 10:05:33 crc kubenswrapper[4922]: I0929 10:05:33.466943 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4945ef36-899a-4e42-b95e-b5dfcca99783" path="/var/lib/kubelet/pods/4945ef36-899a-4e42-b95e-b5dfcca99783/volumes" Sep 29 10:05:33 crc kubenswrapper[4922]: I0929 10:05:33.931016 4922 generic.go:334] "Generic (PLEG): container finished" podID="fd5275e6-c3d3-474d-962a-3cdafc893dfd" containerID="afaf90567367d292fb95c9dadcfc0087b7243adf9583244cd3903aaff5093632" exitCode=0 Sep 29 10:05:33 crc kubenswrapper[4922]: I0929 10:05:33.931148 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" event={"ID":"fd5275e6-c3d3-474d-962a-3cdafc893dfd","Type":"ContainerDied","Data":"afaf90567367d292fb95c9dadcfc0087b7243adf9583244cd3903aaff5093632"} Sep 29 10:05:33 crc kubenswrapper[4922]: I0929 10:05:33.931716 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" event={"ID":"fd5275e6-c3d3-474d-962a-3cdafc893dfd","Type":"ContainerStarted","Data":"1151796b4c044ff9846433a184d7cad73a6cac430a28df42d3875668d5bbb6ae"} Sep 29 10:05:34 crc kubenswrapper[4922]: I0929 10:05:34.944896 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" event={"ID":"fd5275e6-c3d3-474d-962a-3cdafc893dfd","Type":"ContainerStarted","Data":"97909ff8f560deb15a2b94544a0efa8c15bf30bfbd3ab0abc0c8c72b44b11662"} Sep 29 10:05:34 crc kubenswrapper[4922]: I0929 10:05:34.945637 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:34 crc kubenswrapper[4922]: I0929 10:05:34.978765 4922 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" podStartSLOduration=2.9787334149999998 podStartE2EDuration="2.978733415s" podCreationTimestamp="2025-09-29 10:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:34.964906798 +0000 UTC m=+1260.331137062" watchObservedRunningTime="2025-09-29 10:05:34.978733415 +0000 UTC m=+1260.344963679" Sep 29 10:05:42 crc kubenswrapper[4922]: I0929 10:05:42.423309 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-6l2dp" Sep 29 10:05:42 crc kubenswrapper[4922]: I0929 10:05:42.490687 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kwvpf"] Sep 29 10:05:42 crc kubenswrapper[4922]: I0929 10:05:42.491023 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" podUID="321e5bc2-e26d-4f07-939b-a2346d51b576" containerName="dnsmasq-dns" containerID="cri-o://26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2" gracePeriod=10 Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.020347 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.036843 4922 generic.go:334] "Generic (PLEG): container finished" podID="321e5bc2-e26d-4f07-939b-a2346d51b576" containerID="26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2" exitCode=0 Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.036916 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" event={"ID":"321e5bc2-e26d-4f07-939b-a2346d51b576","Type":"ContainerDied","Data":"26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2"} Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.036930 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.036967 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kwvpf" event={"ID":"321e5bc2-e26d-4f07-939b-a2346d51b576","Type":"ContainerDied","Data":"f18a57227b11a4f868600684e4bacb63a7127bc266949763073a2b077eaecdf7"} Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.036991 4922 scope.go:117] "RemoveContainer" containerID="26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.091097 4922 scope.go:117] "RemoveContainer" containerID="fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.112435 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-nb\") pod \"321e5bc2-e26d-4f07-939b-a2346d51b576\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.112529 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-hlhnf\" (UniqueName: \"kubernetes.io/projected/321e5bc2-e26d-4f07-939b-a2346d51b576-kube-api-access-hlhnf\") pod \"321e5bc2-e26d-4f07-939b-a2346d51b576\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.112707 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-config\") pod \"321e5bc2-e26d-4f07-939b-a2346d51b576\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.112742 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-svc\") pod \"321e5bc2-e26d-4f07-939b-a2346d51b576\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.112805 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-openstack-edpm-ipam\") pod \"321e5bc2-e26d-4f07-939b-a2346d51b576\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.112850 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-sb\") pod \"321e5bc2-e26d-4f07-939b-a2346d51b576\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.112929 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-swift-storage-0\") pod \"321e5bc2-e26d-4f07-939b-a2346d51b576\" (UID: \"321e5bc2-e26d-4f07-939b-a2346d51b576\") " Sep 29 10:05:43 
crc kubenswrapper[4922]: I0929 10:05:43.126252 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/321e5bc2-e26d-4f07-939b-a2346d51b576-kube-api-access-hlhnf" (OuterVolumeSpecName: "kube-api-access-hlhnf") pod "321e5bc2-e26d-4f07-939b-a2346d51b576" (UID: "321e5bc2-e26d-4f07-939b-a2346d51b576"). InnerVolumeSpecName "kube-api-access-hlhnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.130900 4922 scope.go:117] "RemoveContainer" containerID="26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2" Sep 29 10:05:43 crc kubenswrapper[4922]: E0929 10:05:43.131560 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2\": container with ID starting with 26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2 not found: ID does not exist" containerID="26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.131598 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2"} err="failed to get container status \"26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2\": rpc error: code = NotFound desc = could not find container \"26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2\": container with ID starting with 26d8e230f62b62e2e00070d2eb73282c55f7d0e31c9ebcc639cad007c667afe2 not found: ID does not exist" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.131624 4922 scope.go:117] "RemoveContainer" containerID="fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584" Sep 29 10:05:43 crc kubenswrapper[4922]: E0929 10:05:43.134590 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584\": container with ID starting with fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584 not found: ID does not exist" containerID="fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.134631 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584"} err="failed to get container status \"fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584\": rpc error: code = NotFound desc = could not find container \"fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584\": container with ID starting with fa3a18db5ffb21d49612dcac7eeca36b455cdc09159f6652415f25fd3e247584 not found: ID does not exist" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.178616 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "321e5bc2-e26d-4f07-939b-a2346d51b576" (UID: "321e5bc2-e26d-4f07-939b-a2346d51b576"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.180085 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "321e5bc2-e26d-4f07-939b-a2346d51b576" (UID: "321e5bc2-e26d-4f07-939b-a2346d51b576"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.180171 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "321e5bc2-e26d-4f07-939b-a2346d51b576" (UID: "321e5bc2-e26d-4f07-939b-a2346d51b576"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.188553 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "321e5bc2-e26d-4f07-939b-a2346d51b576" (UID: "321e5bc2-e26d-4f07-939b-a2346d51b576"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.192560 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-config" (OuterVolumeSpecName: "config") pod "321e5bc2-e26d-4f07-939b-a2346d51b576" (UID: "321e5bc2-e26d-4f07-939b-a2346d51b576"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.193753 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "321e5bc2-e26d-4f07-939b-a2346d51b576" (UID: "321e5bc2-e26d-4f07-939b-a2346d51b576"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.216867 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.216918 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.216933 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.216948 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.216966 4922 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.217000 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/321e5bc2-e26d-4f07-939b-a2346d51b576-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.217016 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlhnf\" (UniqueName: \"kubernetes.io/projected/321e5bc2-e26d-4f07-939b-a2346d51b576-kube-api-access-hlhnf\") on node \"crc\" DevicePath \"\"" Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.378376 
4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kwvpf"] Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.388353 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kwvpf"] Sep 29 10:05:43 crc kubenswrapper[4922]: I0929 10:05:43.464015 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="321e5bc2-e26d-4f07-939b-a2346d51b576" path="/var/lib/kubelet/pods/321e5bc2-e26d-4f07-939b-a2346d51b576/volumes" Sep 29 10:05:51 crc kubenswrapper[4922]: I0929 10:05:51.147030 4922 generic.go:334] "Generic (PLEG): container finished" podID="82b45f77-ae02-47df-b1ab-5137f6e23089" containerID="fe19fc1d7a063692d1f26e8b96a8a40d02eeb8ae0b1db947176d4f8037afb77b" exitCode=0 Sep 29 10:05:51 crc kubenswrapper[4922]: I0929 10:05:51.147135 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"82b45f77-ae02-47df-b1ab-5137f6e23089","Type":"ContainerDied","Data":"fe19fc1d7a063692d1f26e8b96a8a40d02eeb8ae0b1db947176d4f8037afb77b"} Sep 29 10:05:51 crc kubenswrapper[4922]: I0929 10:05:51.150005 4922 generic.go:334] "Generic (PLEG): container finished" podID="5d2c1fe4-f762-40fa-8439-f74d3e234d30" containerID="9f2f5f2ed9b4d13bb1f0bfcd8a2e1f05e83dfa63a176f9dcb6d3da6bc415547a" exitCode=0 Sep 29 10:05:51 crc kubenswrapper[4922]: I0929 10:05:51.150063 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d2c1fe4-f762-40fa-8439-f74d3e234d30","Type":"ContainerDied","Data":"9f2f5f2ed9b4d13bb1f0bfcd8a2e1f05e83dfa63a176f9dcb6d3da6bc415547a"} Sep 29 10:05:52 crc kubenswrapper[4922]: I0929 10:05:52.163313 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"82b45f77-ae02-47df-b1ab-5137f6e23089","Type":"ContainerStarted","Data":"706599f918de57d7c170f9f43b2dd2a0ac67d590da9bf08333cb65b7fe24d49a"} Sep 29 10:05:52 crc kubenswrapper[4922]: I0929 
10:05:52.163997 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:05:52 crc kubenswrapper[4922]: I0929 10:05:52.167476 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d2c1fe4-f762-40fa-8439-f74d3e234d30","Type":"ContainerStarted","Data":"cf1e02480cad540587d0975287d87c57dd22c0a0dab8518c1b40541f2826d93d"} Sep 29 10:05:52 crc kubenswrapper[4922]: I0929 10:05:52.167678 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 29 10:05:52 crc kubenswrapper[4922]: I0929 10:05:52.198766 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.198745276 podStartE2EDuration="34.198745276s" podCreationTimestamp="2025-09-29 10:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:52.190901586 +0000 UTC m=+1277.557131860" watchObservedRunningTime="2025-09-29 10:05:52.198745276 +0000 UTC m=+1277.564975540" Sep 29 10:05:52 crc kubenswrapper[4922]: I0929 10:05:52.226583 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=34.226262242 podStartE2EDuration="34.226262242s" podCreationTimestamp="2025-09-29 10:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:05:52.219730347 +0000 UTC m=+1277.585960631" watchObservedRunningTime="2025-09-29 10:05:52.226262242 +0000 UTC m=+1277.592492496" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.921351 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t"] Sep 29 10:05:55 crc kubenswrapper[4922]: E0929 10:05:55.925960 4922 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4945ef36-899a-4e42-b95e-b5dfcca99783" containerName="init" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.927030 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4945ef36-899a-4e42-b95e-b5dfcca99783" containerName="init" Sep 29 10:05:55 crc kubenswrapper[4922]: E0929 10:05:55.927110 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="321e5bc2-e26d-4f07-939b-a2346d51b576" containerName="dnsmasq-dns" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.927176 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="321e5bc2-e26d-4f07-939b-a2346d51b576" containerName="dnsmasq-dns" Sep 29 10:05:55 crc kubenswrapper[4922]: E0929 10:05:55.927249 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="321e5bc2-e26d-4f07-939b-a2346d51b576" containerName="init" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.927309 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="321e5bc2-e26d-4f07-939b-a2346d51b576" containerName="init" Sep 29 10:05:55 crc kubenswrapper[4922]: E0929 10:05:55.927384 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4945ef36-899a-4e42-b95e-b5dfcca99783" containerName="dnsmasq-dns" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.927436 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4945ef36-899a-4e42-b95e-b5dfcca99783" containerName="dnsmasq-dns" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.927744 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="321e5bc2-e26d-4f07-939b-a2346d51b576" containerName="dnsmasq-dns" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.927816 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4945ef36-899a-4e42-b95e-b5dfcca99783" containerName="dnsmasq-dns" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.928711 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.939635 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.940895 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.941306 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.941488 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:05:55 crc kubenswrapper[4922]: I0929 10:05:55.980915 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t"] Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.007597 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.007905 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.008115 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.008186 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jcxn\" (UniqueName: \"kubernetes.io/projected/7d952f02-09db-44fd-ae8b-6b2c8ea06505-kube-api-access-9jcxn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.110235 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.110689 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.110759 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.110821 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jcxn\" (UniqueName: \"kubernetes.io/projected/7d952f02-09db-44fd-ae8b-6b2c8ea06505-kube-api-access-9jcxn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.119282 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.120108 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.121023 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.135739 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jcxn\" (UniqueName: \"kubernetes.io/projected/7d952f02-09db-44fd-ae8b-6b2c8ea06505-kube-api-access-9jcxn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.276206 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.892822 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t"] Sep 29 10:05:56 crc kubenswrapper[4922]: I0929 10:05:56.899220 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:05:57 crc kubenswrapper[4922]: I0929 10:05:57.231578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" event={"ID":"7d952f02-09db-44fd-ae8b-6b2c8ea06505","Type":"ContainerStarted","Data":"8bf5fde7c3cf8b5568df899da63547944d7e2bb622ca3f6379086dfa0bf589c4"} Sep 29 10:05:59 crc kubenswrapper[4922]: I0929 10:05:59.070949 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:05:59 crc kubenswrapper[4922]: I0929 10:05:59.071403 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Sep 29 10:06:08 crc kubenswrapper[4922]: I0929 10:06:08.363065 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" event={"ID":"7d952f02-09db-44fd-ae8b-6b2c8ea06505","Type":"ContainerStarted","Data":"197c602194c0fb386aef9696dcf5615079262cd843d91eec1f29482ebcd18fb3"} Sep 29 10:06:08 crc kubenswrapper[4922]: I0929 10:06:08.388383 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" podStartSLOduration=3.075876115 podStartE2EDuration="13.388354325s" podCreationTimestamp="2025-09-29 10:05:55 +0000 UTC" firstStartedPulling="2025-09-29 10:05:56.898884717 +0000 UTC m=+1282.265114981" lastFinishedPulling="2025-09-29 10:06:07.211362907 +0000 UTC m=+1292.577593191" observedRunningTime="2025-09-29 10:06:08.387400529 +0000 UTC m=+1293.753630803" watchObservedRunningTime="2025-09-29 10:06:08.388354325 +0000 UTC m=+1293.754584599" Sep 29 10:06:09 crc kubenswrapper[4922]: I0929 10:06:09.145136 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 29 10:06:09 crc kubenswrapper[4922]: I0929 10:06:09.464156 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 29 10:06:20 crc kubenswrapper[4922]: I0929 10:06:20.505083 4922 generic.go:334] "Generic (PLEG): container finished" podID="7d952f02-09db-44fd-ae8b-6b2c8ea06505" containerID="197c602194c0fb386aef9696dcf5615079262cd843d91eec1f29482ebcd18fb3" exitCode=0 Sep 29 10:06:20 crc kubenswrapper[4922]: I0929 10:06:20.505218 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" event={"ID":"7d952f02-09db-44fd-ae8b-6b2c8ea06505","Type":"ContainerDied","Data":"197c602194c0fb386aef9696dcf5615079262cd843d91eec1f29482ebcd18fb3"} Sep 29 10:06:22 crc kubenswrapper[4922]: 
I0929 10:06:22.049368 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.164960 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-ssh-key\") pod \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.165549 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-repo-setup-combined-ca-bundle\") pod \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.165588 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-inventory\") pod \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.165678 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jcxn\" (UniqueName: \"kubernetes.io/projected/7d952f02-09db-44fd-ae8b-6b2c8ea06505-kube-api-access-9jcxn\") pod \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\" (UID: \"7d952f02-09db-44fd-ae8b-6b2c8ea06505\") " Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.174059 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d952f02-09db-44fd-ae8b-6b2c8ea06505-kube-api-access-9jcxn" (OuterVolumeSpecName: "kube-api-access-9jcxn") pod "7d952f02-09db-44fd-ae8b-6b2c8ea06505" (UID: "7d952f02-09db-44fd-ae8b-6b2c8ea06505"). 
InnerVolumeSpecName "kube-api-access-9jcxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.175032 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7d952f02-09db-44fd-ae8b-6b2c8ea06505" (UID: "7d952f02-09db-44fd-ae8b-6b2c8ea06505"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.203184 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-inventory" (OuterVolumeSpecName: "inventory") pod "7d952f02-09db-44fd-ae8b-6b2c8ea06505" (UID: "7d952f02-09db-44fd-ae8b-6b2c8ea06505"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.203960 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7d952f02-09db-44fd-ae8b-6b2c8ea06505" (UID: "7d952f02-09db-44fd-ae8b-6b2c8ea06505"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.268168 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.268231 4922 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.268253 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d952f02-09db-44fd-ae8b-6b2c8ea06505-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.268265 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jcxn\" (UniqueName: \"kubernetes.io/projected/7d952f02-09db-44fd-ae8b-6b2c8ea06505-kube-api-access-9jcxn\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.541404 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" event={"ID":"7d952f02-09db-44fd-ae8b-6b2c8ea06505","Type":"ContainerDied","Data":"8bf5fde7c3cf8b5568df899da63547944d7e2bb622ca3f6379086dfa0bf589c4"} Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.541475 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bf5fde7c3cf8b5568df899da63547944d7e2bb622ca3f6379086dfa0bf589c4" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.543120 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.663326 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds"] Sep 29 10:06:22 crc kubenswrapper[4922]: E0929 10:06:22.663948 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d952f02-09db-44fd-ae8b-6b2c8ea06505" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.663983 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d952f02-09db-44fd-ae8b-6b2c8ea06505" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.664293 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d952f02-09db-44fd-ae8b-6b2c8ea06505" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.665415 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.668150 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.668189 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.668930 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.671404 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.684271 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds"] Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.780656 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2rqds\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.780715 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2rqds\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.780975 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqrj2\" (UniqueName: \"kubernetes.io/projected/782111a0-a54f-49fa-a519-e0d3a68e9cbf-kube-api-access-tqrj2\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2rqds\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.883287 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2rqds\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.883377 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2rqds\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.883738 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqrj2\" (UniqueName: \"kubernetes.io/projected/782111a0-a54f-49fa-a519-e0d3a68e9cbf-kube-api-access-tqrj2\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2rqds\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.889803 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2rqds\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.890893 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2rqds\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:22 crc kubenswrapper[4922]: I0929 10:06:22.910689 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqrj2\" (UniqueName: \"kubernetes.io/projected/782111a0-a54f-49fa-a519-e0d3a68e9cbf-kube-api-access-tqrj2\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2rqds\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:23 crc kubenswrapper[4922]: I0929 10:06:23.021049 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:23 crc kubenswrapper[4922]: I0929 10:06:23.587514 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds"] Sep 29 10:06:24 crc kubenswrapper[4922]: I0929 10:06:24.566086 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" event={"ID":"782111a0-a54f-49fa-a519-e0d3a68e9cbf","Type":"ContainerStarted","Data":"4002a91331cf94600cf9f6b519157777f375599bc90636f83ddad3b5fd7b9f7d"} Sep 29 10:06:24 crc kubenswrapper[4922]: I0929 10:06:24.567095 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" event={"ID":"782111a0-a54f-49fa-a519-e0d3a68e9cbf","Type":"ContainerStarted","Data":"d6fcd8796ade1ca25be33997657134aa59bfeef693547a4f0678789739359c5f"} Sep 29 10:06:24 crc kubenswrapper[4922]: I0929 10:06:24.593915 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" podStartSLOduration=2.137158754 podStartE2EDuration="2.593889019s" podCreationTimestamp="2025-09-29 10:06:22 +0000 UTC" firstStartedPulling="2025-09-29 10:06:23.592466647 +0000 UTC m=+1308.958696921" lastFinishedPulling="2025-09-29 10:06:24.049196912 +0000 UTC m=+1309.415427186" observedRunningTime="2025-09-29 10:06:24.591818564 +0000 UTC m=+1309.958048838" watchObservedRunningTime="2025-09-29 10:06:24.593889019 +0000 UTC m=+1309.960119283" Sep 29 10:06:27 crc kubenswrapper[4922]: I0929 10:06:27.612133 4922 generic.go:334] "Generic (PLEG): container finished" podID="782111a0-a54f-49fa-a519-e0d3a68e9cbf" containerID="4002a91331cf94600cf9f6b519157777f375599bc90636f83ddad3b5fd7b9f7d" exitCode=0 Sep 29 10:06:27 crc kubenswrapper[4922]: I0929 10:06:27.612246 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" event={"ID":"782111a0-a54f-49fa-a519-e0d3a68e9cbf","Type":"ContainerDied","Data":"4002a91331cf94600cf9f6b519157777f375599bc90636f83ddad3b5fd7b9f7d"} Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.070727 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.071620 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.071694 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.073433 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a1549b8b442c454d49bdd016344f1f8e0f5b0aa9b4f4d0ded96439b8c2d215c"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.073544 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://0a1549b8b442c454d49bdd016344f1f8e0f5b0aa9b4f4d0ded96439b8c2d215c" gracePeriod=600 Sep 
29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.093606 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.134009 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-ssh-key\") pod \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.134155 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-inventory\") pod \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.134331 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqrj2\" (UniqueName: \"kubernetes.io/projected/782111a0-a54f-49fa-a519-e0d3a68e9cbf-kube-api-access-tqrj2\") pod \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\" (UID: \"782111a0-a54f-49fa-a519-e0d3a68e9cbf\") " Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.143937 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782111a0-a54f-49fa-a519-e0d3a68e9cbf-kube-api-access-tqrj2" (OuterVolumeSpecName: "kube-api-access-tqrj2") pod "782111a0-a54f-49fa-a519-e0d3a68e9cbf" (UID: "782111a0-a54f-49fa-a519-e0d3a68e9cbf"). InnerVolumeSpecName "kube-api-access-tqrj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.203419 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-inventory" (OuterVolumeSpecName: "inventory") pod "782111a0-a54f-49fa-a519-e0d3a68e9cbf" (UID: "782111a0-a54f-49fa-a519-e0d3a68e9cbf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.210015 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "782111a0-a54f-49fa-a519-e0d3a68e9cbf" (UID: "782111a0-a54f-49fa-a519-e0d3a68e9cbf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.241696 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqrj2\" (UniqueName: \"kubernetes.io/projected/782111a0-a54f-49fa-a519-e0d3a68e9cbf-kube-api-access-tqrj2\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.241753 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.241766 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/782111a0-a54f-49fa-a519-e0d3a68e9cbf-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.641132 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" 
event={"ID":"782111a0-a54f-49fa-a519-e0d3a68e9cbf","Type":"ContainerDied","Data":"d6fcd8796ade1ca25be33997657134aa59bfeef693547a4f0678789739359c5f"} Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.641680 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6fcd8796ade1ca25be33997657134aa59bfeef693547a4f0678789739359c5f" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.641153 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2rqds" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.645742 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="0a1549b8b442c454d49bdd016344f1f8e0f5b0aa9b4f4d0ded96439b8c2d215c" exitCode=0 Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.645803 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"0a1549b8b442c454d49bdd016344f1f8e0f5b0aa9b4f4d0ded96439b8c2d215c"} Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.645884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd"} Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.645910 4922 scope.go:117] "RemoveContainer" containerID="2a477bfa77fba14648b7136b725546b719661c46663d83dacb1d16385e73fcc2" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.731922 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z"] Sep 29 10:06:29 crc kubenswrapper[4922]: E0929 10:06:29.732596 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="782111a0-a54f-49fa-a519-e0d3a68e9cbf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.732626 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="782111a0-a54f-49fa-a519-e0d3a68e9cbf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.732990 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="782111a0-a54f-49fa-a519-e0d3a68e9cbf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.747226 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.752725 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.752877 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z"] Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.753071 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.753348 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.755970 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.859619 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.859750 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.859798 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsgd6\" (UniqueName: \"kubernetes.io/projected/9c5d1232-a030-44f4-823e-5c806d5dd896-kube-api-access-qsgd6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.859906 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.962823 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 
10:06:29.962981 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsgd6\" (UniqueName: \"kubernetes.io/projected/9c5d1232-a030-44f4-823e-5c806d5dd896-kube-api-access-qsgd6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.963159 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.963329 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.972456 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.972777 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.973384 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:29 crc kubenswrapper[4922]: I0929 10:06:29.983309 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsgd6\" (UniqueName: \"kubernetes.io/projected/9c5d1232-a030-44f4-823e-5c806d5dd896-kube-api-access-qsgd6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:30 crc kubenswrapper[4922]: I0929 10:06:30.093483 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:06:30 crc kubenswrapper[4922]: I0929 10:06:30.687561 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z"] Sep 29 10:06:30 crc kubenswrapper[4922]: W0929 10:06:30.696014 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c5d1232_a030_44f4_823e_5c806d5dd896.slice/crio-93824d24e5bbf4ef03c26f23372143b38e9915b9a209b80c15338b6338daac67 WatchSource:0}: Error finding container 93824d24e5bbf4ef03c26f23372143b38e9915b9a209b80c15338b6338daac67: Status 404 returned error can't find the container with id 93824d24e5bbf4ef03c26f23372143b38e9915b9a209b80c15338b6338daac67 Sep 29 10:06:31 crc kubenswrapper[4922]: I0929 10:06:31.671920 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" event={"ID":"9c5d1232-a030-44f4-823e-5c806d5dd896","Type":"ContainerStarted","Data":"ced166df789a1ea6baac971de6786a1f0ba66c081ff0c9ba3cb90a742addebfc"} Sep 29 10:06:31 crc kubenswrapper[4922]: I0929 10:06:31.672790 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" event={"ID":"9c5d1232-a030-44f4-823e-5c806d5dd896","Type":"ContainerStarted","Data":"93824d24e5bbf4ef03c26f23372143b38e9915b9a209b80c15338b6338daac67"} Sep 29 10:06:31 crc kubenswrapper[4922]: I0929 10:06:31.695013 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" podStartSLOduration=2.225767364 podStartE2EDuration="2.694998033s" podCreationTimestamp="2025-09-29 10:06:29 +0000 UTC" firstStartedPulling="2025-09-29 10:06:30.69915204 +0000 UTC m=+1316.065382304" lastFinishedPulling="2025-09-29 10:06:31.168382669 +0000 UTC m=+1316.534612973" 
observedRunningTime="2025-09-29 10:06:31.692528867 +0000 UTC m=+1317.058759151" watchObservedRunningTime="2025-09-29 10:06:31.694998033 +0000 UTC m=+1317.061228297" Sep 29 10:06:48 crc kubenswrapper[4922]: I0929 10:06:48.795352 4922 scope.go:117] "RemoveContainer" containerID="5ce738685b7ac4bcae711e4dae129553a69d4b6d2e7ea4ec3d0c889c1e33f51e" Sep 29 10:06:48 crc kubenswrapper[4922]: I0929 10:06:48.838270 4922 scope.go:117] "RemoveContainer" containerID="c0c1513621326b7ee76988d10481ab49df8430052ed9ac09b6703845a7d4a027" Sep 29 10:07:48 crc kubenswrapper[4922]: I0929 10:07:48.957490 4922 scope.go:117] "RemoveContainer" containerID="a983506c905121861874a861f7b04dd5f7d41454b3c4bd6730d6849a8d4cc35c" Sep 29 10:08:02 crc kubenswrapper[4922]: I0929 10:08:02.838001 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2zkln"] Sep 29 10:08:02 crc kubenswrapper[4922]: I0929 10:08:02.841294 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:02 crc kubenswrapper[4922]: I0929 10:08:02.859655 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zkln"] Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.026092 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-catalog-content\") pod \"redhat-operators-2zkln\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.026462 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-utilities\") pod \"redhat-operators-2zkln\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " 
pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.026590 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p74zz\" (UniqueName: \"kubernetes.io/projected/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-kube-api-access-p74zz\") pod \"redhat-operators-2zkln\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.129561 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-utilities\") pod \"redhat-operators-2zkln\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.129682 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p74zz\" (UniqueName: \"kubernetes.io/projected/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-kube-api-access-p74zz\") pod \"redhat-operators-2zkln\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.129809 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-catalog-content\") pod \"redhat-operators-2zkln\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.130722 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-catalog-content\") pod \"redhat-operators-2zkln\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " 
pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.131062 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-utilities\") pod \"redhat-operators-2zkln\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.152043 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p74zz\" (UniqueName: \"kubernetes.io/projected/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-kube-api-access-p74zz\") pod \"redhat-operators-2zkln\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.174636 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.711526 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zkln"] Sep 29 10:08:03 crc kubenswrapper[4922]: I0929 10:08:03.782039 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zkln" event={"ID":"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e","Type":"ContainerStarted","Data":"068a6863106fd7f80af663badd2b664c97267d84b54514cf5964c9afb7ba1e12"} Sep 29 10:08:04 crc kubenswrapper[4922]: I0929 10:08:04.796023 4922 generic.go:334] "Generic (PLEG): container finished" podID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerID="b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494" exitCode=0 Sep 29 10:08:04 crc kubenswrapper[4922]: I0929 10:08:04.796096 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zkln" 
event={"ID":"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e","Type":"ContainerDied","Data":"b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494"} Sep 29 10:08:05 crc kubenswrapper[4922]: I0929 10:08:05.812300 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zkln" event={"ID":"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e","Type":"ContainerStarted","Data":"fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa"} Sep 29 10:08:06 crc kubenswrapper[4922]: I0929 10:08:06.826451 4922 generic.go:334] "Generic (PLEG): container finished" podID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerID="fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa" exitCode=0 Sep 29 10:08:06 crc kubenswrapper[4922]: I0929 10:08:06.826555 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zkln" event={"ID":"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e","Type":"ContainerDied","Data":"fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa"} Sep 29 10:08:07 crc kubenswrapper[4922]: I0929 10:08:07.839851 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zkln" event={"ID":"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e","Type":"ContainerStarted","Data":"565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a"} Sep 29 10:08:07 crc kubenswrapper[4922]: I0929 10:08:07.873821 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2zkln" podStartSLOduration=3.319732279 podStartE2EDuration="5.873798825s" podCreationTimestamp="2025-09-29 10:08:02 +0000 UTC" firstStartedPulling="2025-09-29 10:08:04.798483869 +0000 UTC m=+1410.164714133" lastFinishedPulling="2025-09-29 10:08:07.352550415 +0000 UTC m=+1412.718780679" observedRunningTime="2025-09-29 10:08:07.86499431 +0000 UTC m=+1413.231224584" watchObservedRunningTime="2025-09-29 10:08:07.873798825 +0000 UTC m=+1413.240029089" Sep 
29 10:08:13 crc kubenswrapper[4922]: I0929 10:08:13.175758 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:13 crc kubenswrapper[4922]: I0929 10:08:13.176506 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:13 crc kubenswrapper[4922]: I0929 10:08:13.245426 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:13 crc kubenswrapper[4922]: I0929 10:08:13.986347 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:14 crc kubenswrapper[4922]: I0929 10:08:14.048220 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2zkln"] Sep 29 10:08:15 crc kubenswrapper[4922]: I0929 10:08:15.939730 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2zkln" podUID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerName="registry-server" containerID="cri-o://565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a" gracePeriod=2 Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.464320 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.555750 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p74zz\" (UniqueName: \"kubernetes.io/projected/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-kube-api-access-p74zz\") pod \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.556134 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-catalog-content\") pod \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.556256 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-utilities\") pod \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\" (UID: \"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e\") " Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.557082 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-utilities" (OuterVolumeSpecName: "utilities") pod "24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" (UID: "24384a1e-46a1-4bcb-9f2c-6b77a95bda0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.563575 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-kube-api-access-p74zz" (OuterVolumeSpecName: "kube-api-access-p74zz") pod "24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" (UID: "24384a1e-46a1-4bcb-9f2c-6b77a95bda0e"). InnerVolumeSpecName "kube-api-access-p74zz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.659184 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.659237 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p74zz\" (UniqueName: \"kubernetes.io/projected/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-kube-api-access-p74zz\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.684120 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" (UID: "24384a1e-46a1-4bcb-9f2c-6b77a95bda0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.761229 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.955371 4922 generic.go:334] "Generic (PLEG): container finished" podID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerID="565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a" exitCode=0 Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.955425 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zkln" event={"ID":"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e","Type":"ContainerDied","Data":"565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a"} Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.955441 4922 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zkln" Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.955472 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zkln" event={"ID":"24384a1e-46a1-4bcb-9f2c-6b77a95bda0e","Type":"ContainerDied","Data":"068a6863106fd7f80af663badd2b664c97267d84b54514cf5964c9afb7ba1e12"} Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.955515 4922 scope.go:117] "RemoveContainer" containerID="565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a" Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.992578 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2zkln"] Sep 29 10:08:16 crc kubenswrapper[4922]: I0929 10:08:16.993130 4922 scope.go:117] "RemoveContainer" containerID="fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa" Sep 29 10:08:17 crc kubenswrapper[4922]: I0929 10:08:17.004221 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2zkln"] Sep 29 10:08:17 crc kubenswrapper[4922]: I0929 10:08:17.035117 4922 scope.go:117] "RemoveContainer" containerID="b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494" Sep 29 10:08:17 crc kubenswrapper[4922]: I0929 10:08:17.077238 4922 scope.go:117] "RemoveContainer" containerID="565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a" Sep 29 10:08:17 crc kubenswrapper[4922]: E0929 10:08:17.077888 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a\": container with ID starting with 565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a not found: ID does not exist" containerID="565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a" Sep 29 10:08:17 crc kubenswrapper[4922]: I0929 10:08:17.077919 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a"} err="failed to get container status \"565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a\": rpc error: code = NotFound desc = could not find container \"565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a\": container with ID starting with 565d8eb0c65f3c2fa1990dbccb7f392f3039af914a8f91382b208024a8820a6a not found: ID does not exist" Sep 29 10:08:17 crc kubenswrapper[4922]: I0929 10:08:17.077941 4922 scope.go:117] "RemoveContainer" containerID="fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa" Sep 29 10:08:17 crc kubenswrapper[4922]: E0929 10:08:17.078383 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa\": container with ID starting with fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa not found: ID does not exist" containerID="fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa" Sep 29 10:08:17 crc kubenswrapper[4922]: I0929 10:08:17.078405 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa"} err="failed to get container status \"fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa\": rpc error: code = NotFound desc = could not find container \"fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa\": container with ID starting with fb7e8fd5d11f7819264f9c7f9422741b590b5981fa9301c37081ac81664f31aa not found: ID does not exist" Sep 29 10:08:17 crc kubenswrapper[4922]: I0929 10:08:17.078425 4922 scope.go:117] "RemoveContainer" containerID="b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494" Sep 29 10:08:17 crc kubenswrapper[4922]: E0929 
10:08:17.078887 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494\": container with ID starting with b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494 not found: ID does not exist" containerID="b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494" Sep 29 10:08:17 crc kubenswrapper[4922]: I0929 10:08:17.078911 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494"} err="failed to get container status \"b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494\": rpc error: code = NotFound desc = could not find container \"b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494\": container with ID starting with b53afad4f508f10dd75ba9bf2a39433f8a85f2699b4084fb8780759a41b5b494 not found: ID does not exist" Sep 29 10:08:17 crc kubenswrapper[4922]: I0929 10:08:17.487916 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" path="/var/lib/kubelet/pods/24384a1e-46a1-4bcb-9f2c-6b77a95bda0e/volumes" Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.857139 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wn9w7"] Sep 29 10:08:20 crc kubenswrapper[4922]: E0929 10:08:20.859622 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerName="extract-utilities" Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.859758 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerName="extract-utilities" Sep 29 10:08:20 crc kubenswrapper[4922]: E0929 10:08:20.859857 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerName="extract-content" Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.859927 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerName="extract-content" Sep 29 10:08:20 crc kubenswrapper[4922]: E0929 10:08:20.859998 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerName="registry-server" Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.860061 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerName="registry-server" Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.860363 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="24384a1e-46a1-4bcb-9f2c-6b77a95bda0e" containerName="registry-server" Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.862378 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.873156 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wn9w7"] Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.956690 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-catalog-content\") pod \"certified-operators-wn9w7\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.956751 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-utilities\") pod \"certified-operators-wn9w7\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") 
" pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:20 crc kubenswrapper[4922]: I0929 10:08:20.956795 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x7gn\" (UniqueName: \"kubernetes.io/projected/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-kube-api-access-6x7gn\") pod \"certified-operators-wn9w7\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:21 crc kubenswrapper[4922]: I0929 10:08:21.059208 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-catalog-content\") pod \"certified-operators-wn9w7\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:21 crc kubenswrapper[4922]: I0929 10:08:21.059656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-utilities\") pod \"certified-operators-wn9w7\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:21 crc kubenswrapper[4922]: I0929 10:08:21.059702 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x7gn\" (UniqueName: \"kubernetes.io/projected/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-kube-api-access-6x7gn\") pod \"certified-operators-wn9w7\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:21 crc kubenswrapper[4922]: I0929 10:08:21.059824 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-catalog-content\") pod \"certified-operators-wn9w7\" (UID: 
\"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:21 crc kubenswrapper[4922]: I0929 10:08:21.060107 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-utilities\") pod \"certified-operators-wn9w7\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:21 crc kubenswrapper[4922]: I0929 10:08:21.082672 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x7gn\" (UniqueName: \"kubernetes.io/projected/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-kube-api-access-6x7gn\") pod \"certified-operators-wn9w7\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:21 crc kubenswrapper[4922]: I0929 10:08:21.190538 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:21 crc kubenswrapper[4922]: W0929 10:08:21.632878 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a2f151a_9f8e_439c_a29d_2ffd0d26cc39.slice/crio-070961d5d31daa5837f84e39ac4f4770f15c9b3f6a86a637eb015d1cc7a22610 WatchSource:0}: Error finding container 070961d5d31daa5837f84e39ac4f4770f15c9b3f6a86a637eb015d1cc7a22610: Status 404 returned error can't find the container with id 070961d5d31daa5837f84e39ac4f4770f15c9b3f6a86a637eb015d1cc7a22610 Sep 29 10:08:21 crc kubenswrapper[4922]: I0929 10:08:21.632909 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wn9w7"] Sep 29 10:08:22 crc kubenswrapper[4922]: I0929 10:08:22.013882 4922 generic.go:334] "Generic (PLEG): container finished" podID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" 
containerID="d9fca7298b8a24cebf4c5b617339fefa89d9b410d17068d62b08ec9eb24daa5f" exitCode=0 Sep 29 10:08:22 crc kubenswrapper[4922]: I0929 10:08:22.013973 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn9w7" event={"ID":"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39","Type":"ContainerDied","Data":"d9fca7298b8a24cebf4c5b617339fefa89d9b410d17068d62b08ec9eb24daa5f"} Sep 29 10:08:22 crc kubenswrapper[4922]: I0929 10:08:22.014613 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn9w7" event={"ID":"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39","Type":"ContainerStarted","Data":"070961d5d31daa5837f84e39ac4f4770f15c9b3f6a86a637eb015d1cc7a22610"} Sep 29 10:08:24 crc kubenswrapper[4922]: I0929 10:08:24.041391 4922 generic.go:334] "Generic (PLEG): container finished" podID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerID="cebd45d34c58a5757a74c7ccf613e420aa9bf4d4673563a0ded7f6d318568c70" exitCode=0 Sep 29 10:08:24 crc kubenswrapper[4922]: I0929 10:08:24.041778 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn9w7" event={"ID":"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39","Type":"ContainerDied","Data":"cebd45d34c58a5757a74c7ccf613e420aa9bf4d4673563a0ded7f6d318568c70"} Sep 29 10:08:25 crc kubenswrapper[4922]: I0929 10:08:25.056115 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn9w7" event={"ID":"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39","Type":"ContainerStarted","Data":"62273e78d9710a4022d9c4d0e88c73f892d3ead761c1c6181bed6db18ed09522"} Sep 29 10:08:25 crc kubenswrapper[4922]: I0929 10:08:25.077319 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wn9w7" podStartSLOduration=2.535333107 podStartE2EDuration="5.077296329s" podCreationTimestamp="2025-09-29 10:08:20 +0000 UTC" firstStartedPulling="2025-09-29 10:08:22.016405458 
+0000 UTC m=+1427.382635722" lastFinishedPulling="2025-09-29 10:08:24.55836868 +0000 UTC m=+1429.924598944" observedRunningTime="2025-09-29 10:08:25.076085356 +0000 UTC m=+1430.442315620" watchObservedRunningTime="2025-09-29 10:08:25.077296329 +0000 UTC m=+1430.443526613" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.071141 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.071246 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.441590 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vmzw4"] Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.456879 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.490558 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmzw4"] Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.630844 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-catalog-content\") pod \"redhat-marketplace-vmzw4\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.630983 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlp4t\" (UniqueName: \"kubernetes.io/projected/73524012-4940-434f-8e49-c1ff37cf8fc2-kube-api-access-vlp4t\") pod \"redhat-marketplace-vmzw4\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.631089 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-utilities\") pod \"redhat-marketplace-vmzw4\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.733482 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-utilities\") pod \"redhat-marketplace-vmzw4\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.733620 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-catalog-content\") pod \"redhat-marketplace-vmzw4\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.733676 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlp4t\" (UniqueName: \"kubernetes.io/projected/73524012-4940-434f-8e49-c1ff37cf8fc2-kube-api-access-vlp4t\") pod \"redhat-marketplace-vmzw4\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.734638 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-utilities\") pod \"redhat-marketplace-vmzw4\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.734780 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-catalog-content\") pod \"redhat-marketplace-vmzw4\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.767753 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlp4t\" (UniqueName: \"kubernetes.io/projected/73524012-4940-434f-8e49-c1ff37cf8fc2-kube-api-access-vlp4t\") pod \"redhat-marketplace-vmzw4\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:29 crc kubenswrapper[4922]: I0929 10:08:29.800634 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:30 crc kubenswrapper[4922]: I0929 10:08:30.347044 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmzw4"] Sep 29 10:08:31 crc kubenswrapper[4922]: I0929 10:08:31.122410 4922 generic.go:334] "Generic (PLEG): container finished" podID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerID="f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3" exitCode=0 Sep 29 10:08:31 crc kubenswrapper[4922]: I0929 10:08:31.122498 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmzw4" event={"ID":"73524012-4940-434f-8e49-c1ff37cf8fc2","Type":"ContainerDied","Data":"f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3"} Sep 29 10:08:31 crc kubenswrapper[4922]: I0929 10:08:31.122584 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmzw4" event={"ID":"73524012-4940-434f-8e49-c1ff37cf8fc2","Type":"ContainerStarted","Data":"0fd67aefeac390511f2b36e1452f9e72b251e71a6011233647e619c7f59fe991"} Sep 29 10:08:31 crc kubenswrapper[4922]: I0929 10:08:31.190845 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:31 crc kubenswrapper[4922]: I0929 10:08:31.191219 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:31 crc kubenswrapper[4922]: I0929 10:08:31.246696 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:32 crc kubenswrapper[4922]: I0929 10:08:32.184942 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:33 crc kubenswrapper[4922]: I0929 10:08:33.609116 4922 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wn9w7"] Sep 29 10:08:34 crc kubenswrapper[4922]: I0929 10:08:34.157522 4922 generic.go:334] "Generic (PLEG): container finished" podID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerID="e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39" exitCode=0 Sep 29 10:08:34 crc kubenswrapper[4922]: I0929 10:08:34.157639 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmzw4" event={"ID":"73524012-4940-434f-8e49-c1ff37cf8fc2","Type":"ContainerDied","Data":"e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39"} Sep 29 10:08:34 crc kubenswrapper[4922]: I0929 10:08:34.158212 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wn9w7" podUID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerName="registry-server" containerID="cri-o://62273e78d9710a4022d9c4d0e88c73f892d3ead761c1c6181bed6db18ed09522" gracePeriod=2 Sep 29 10:08:35 crc kubenswrapper[4922]: I0929 10:08:35.171582 4922 generic.go:334] "Generic (PLEG): container finished" podID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerID="62273e78d9710a4022d9c4d0e88c73f892d3ead761c1c6181bed6db18ed09522" exitCode=0 Sep 29 10:08:35 crc kubenswrapper[4922]: I0929 10:08:35.171811 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn9w7" event={"ID":"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39","Type":"ContainerDied","Data":"62273e78d9710a4022d9c4d0e88c73f892d3ead761c1c6181bed6db18ed09522"} Sep 29 10:08:35 crc kubenswrapper[4922]: I0929 10:08:35.988043 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.095719 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-utilities\") pod \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.096019 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-catalog-content\") pod \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.096050 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x7gn\" (UniqueName: \"kubernetes.io/projected/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-kube-api-access-6x7gn\") pod \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\" (UID: \"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39\") " Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.096809 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-utilities" (OuterVolumeSpecName: "utilities") pod "8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" (UID: "8a2f151a-9f8e-439c-a29d-2ffd0d26cc39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.106729 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-kube-api-access-6x7gn" (OuterVolumeSpecName: "kube-api-access-6x7gn") pod "8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" (UID: "8a2f151a-9f8e-439c-a29d-2ffd0d26cc39"). InnerVolumeSpecName "kube-api-access-6x7gn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.142746 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" (UID: "8a2f151a-9f8e-439c-a29d-2ffd0d26cc39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.187931 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wn9w7" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.188224 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wn9w7" event={"ID":"8a2f151a-9f8e-439c-a29d-2ffd0d26cc39","Type":"ContainerDied","Data":"070961d5d31daa5837f84e39ac4f4770f15c9b3f6a86a637eb015d1cc7a22610"} Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.188325 4922 scope.go:117] "RemoveContainer" containerID="62273e78d9710a4022d9c4d0e88c73f892d3ead761c1c6181bed6db18ed09522" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.190917 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmzw4" event={"ID":"73524012-4940-434f-8e49-c1ff37cf8fc2","Type":"ContainerStarted","Data":"bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77"} Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.198946 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.198993 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.199009 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x7gn\" (UniqueName: \"kubernetes.io/projected/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39-kube-api-access-6x7gn\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.218651 4922 scope.go:117] "RemoveContainer" containerID="cebd45d34c58a5757a74c7ccf613e420aa9bf4d4673563a0ded7f6d318568c70" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.222952 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vmzw4" podStartSLOduration=3.087074139 podStartE2EDuration="7.2229343s" podCreationTimestamp="2025-09-29 10:08:29 +0000 UTC" firstStartedPulling="2025-09-29 10:08:31.125110802 +0000 UTC m=+1436.491341066" lastFinishedPulling="2025-09-29 10:08:35.260970943 +0000 UTC m=+1440.627201227" observedRunningTime="2025-09-29 10:08:36.212634705 +0000 UTC m=+1441.578864969" watchObservedRunningTime="2025-09-29 10:08:36.2229343 +0000 UTC m=+1441.589164564" Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.239525 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wn9w7"] Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.249212 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wn9w7"] Sep 29 10:08:36 crc kubenswrapper[4922]: I0929 10:08:36.256979 4922 scope.go:117] "RemoveContainer" containerID="d9fca7298b8a24cebf4c5b617339fefa89d9b410d17068d62b08ec9eb24daa5f" Sep 29 10:08:37 crc kubenswrapper[4922]: I0929 10:08:37.474328 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" path="/var/lib/kubelet/pods/8a2f151a-9f8e-439c-a29d-2ffd0d26cc39/volumes" Sep 29 10:08:39 
crc kubenswrapper[4922]: I0929 10:08:39.800873 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:39 crc kubenswrapper[4922]: I0929 10:08:39.801439 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:39 crc kubenswrapper[4922]: I0929 10:08:39.861496 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:40 crc kubenswrapper[4922]: I0929 10:08:40.299911 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:41 crc kubenswrapper[4922]: I0929 10:08:41.206261 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmzw4"] Sep 29 10:08:42 crc kubenswrapper[4922]: I0929 10:08:42.255264 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vmzw4" podUID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerName="registry-server" containerID="cri-o://bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77" gracePeriod=2 Sep 29 10:08:42 crc kubenswrapper[4922]: I0929 10:08:42.813477 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:42 crc kubenswrapper[4922]: I0929 10:08:42.965418 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlp4t\" (UniqueName: \"kubernetes.io/projected/73524012-4940-434f-8e49-c1ff37cf8fc2-kube-api-access-vlp4t\") pod \"73524012-4940-434f-8e49-c1ff37cf8fc2\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " Sep 29 10:08:42 crc kubenswrapper[4922]: I0929 10:08:42.965572 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-utilities\") pod \"73524012-4940-434f-8e49-c1ff37cf8fc2\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " Sep 29 10:08:42 crc kubenswrapper[4922]: I0929 10:08:42.965979 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-catalog-content\") pod \"73524012-4940-434f-8e49-c1ff37cf8fc2\" (UID: \"73524012-4940-434f-8e49-c1ff37cf8fc2\") " Sep 29 10:08:42 crc kubenswrapper[4922]: I0929 10:08:42.967106 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-utilities" (OuterVolumeSpecName: "utilities") pod "73524012-4940-434f-8e49-c1ff37cf8fc2" (UID: "73524012-4940-434f-8e49-c1ff37cf8fc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:42 crc kubenswrapper[4922]: I0929 10:08:42.973712 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73524012-4940-434f-8e49-c1ff37cf8fc2-kube-api-access-vlp4t" (OuterVolumeSpecName: "kube-api-access-vlp4t") pod "73524012-4940-434f-8e49-c1ff37cf8fc2" (UID: "73524012-4940-434f-8e49-c1ff37cf8fc2"). InnerVolumeSpecName "kube-api-access-vlp4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:08:42 crc kubenswrapper[4922]: I0929 10:08:42.981706 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73524012-4940-434f-8e49-c1ff37cf8fc2" (UID: "73524012-4940-434f-8e49-c1ff37cf8fc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.069425 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.069491 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73524012-4940-434f-8e49-c1ff37cf8fc2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.069509 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlp4t\" (UniqueName: \"kubernetes.io/projected/73524012-4940-434f-8e49-c1ff37cf8fc2-kube-api-access-vlp4t\") on node \"crc\" DevicePath \"\"" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.277470 4922 generic.go:334] "Generic (PLEG): container finished" podID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerID="bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77" exitCode=0 Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.277547 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmzw4" event={"ID":"73524012-4940-434f-8e49-c1ff37cf8fc2","Type":"ContainerDied","Data":"bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77"} Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.277555 4922 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmzw4" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.277605 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmzw4" event={"ID":"73524012-4940-434f-8e49-c1ff37cf8fc2","Type":"ContainerDied","Data":"0fd67aefeac390511f2b36e1452f9e72b251e71a6011233647e619c7f59fe991"} Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.277636 4922 scope.go:117] "RemoveContainer" containerID="bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.309004 4922 scope.go:117] "RemoveContainer" containerID="e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.320974 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmzw4"] Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.330615 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmzw4"] Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.360109 4922 scope.go:117] "RemoveContainer" containerID="f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.398448 4922 scope.go:117] "RemoveContainer" containerID="bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77" Sep 29 10:08:43 crc kubenswrapper[4922]: E0929 10:08:43.400199 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77\": container with ID starting with bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77 not found: ID does not exist" containerID="bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.400311 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77"} err="failed to get container status \"bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77\": rpc error: code = NotFound desc = could not find container \"bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77\": container with ID starting with bf7402639f4b892e86c2c3da8dbcb8b3aee63ea37e7a6aad28eb207ad55bfc77 not found: ID does not exist" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.400378 4922 scope.go:117] "RemoveContainer" containerID="e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39" Sep 29 10:08:43 crc kubenswrapper[4922]: E0929 10:08:43.401114 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39\": container with ID starting with e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39 not found: ID does not exist" containerID="e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.401187 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39"} err="failed to get container status \"e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39\": rpc error: code = NotFound desc = could not find container \"e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39\": container with ID starting with e2e54e95914fa47f8bbb13ab9a262d853ea2dc7a6e4e5e06930b5b0372865f39 not found: ID does not exist" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.401231 4922 scope.go:117] "RemoveContainer" containerID="f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3" Sep 29 10:08:43 crc kubenswrapper[4922]: E0929 
10:08:43.401691 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3\": container with ID starting with f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3 not found: ID does not exist" containerID="f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.401728 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3"} err="failed to get container status \"f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3\": rpc error: code = NotFound desc = could not find container \"f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3\": container with ID starting with f016c1c96ea7b12389c77fa268c5b3ee162e3c3ba137fbce12b0ece0d97ea1c3 not found: ID does not exist" Sep 29 10:08:43 crc kubenswrapper[4922]: I0929 10:08:43.467166 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73524012-4940-434f-8e49-c1ff37cf8fc2" path="/var/lib/kubelet/pods/73524012-4940-434f-8e49-c1ff37cf8fc2/volumes" Sep 29 10:08:59 crc kubenswrapper[4922]: I0929 10:08:59.071371 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:08:59 crc kubenswrapper[4922]: I0929 10:08:59.072350 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 29 10:09:29 crc kubenswrapper[4922]: I0929 10:09:29.071486 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:09:29 crc kubenswrapper[4922]: I0929 10:09:29.072540 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:09:29 crc kubenswrapper[4922]: I0929 10:09:29.072634 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 10:09:29 crc kubenswrapper[4922]: I0929 10:09:29.074046 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:09:29 crc kubenswrapper[4922]: I0929 10:09:29.074145 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" gracePeriod=600 Sep 29 10:09:29 crc kubenswrapper[4922]: E0929 10:09:29.282092 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:09:29 crc kubenswrapper[4922]: I0929 10:09:29.803703 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" exitCode=0 Sep 29 10:09:29 crc kubenswrapper[4922]: I0929 10:09:29.803778 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd"} Sep 29 10:09:29 crc kubenswrapper[4922]: I0929 10:09:29.803889 4922 scope.go:117] "RemoveContainer" containerID="0a1549b8b442c454d49bdd016344f1f8e0f5b0aa9b4f4d0ded96439b8c2d215c" Sep 29 10:09:29 crc kubenswrapper[4922]: I0929 10:09:29.805536 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:09:29 crc kubenswrapper[4922]: E0929 10:09:29.806349 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:09:43 crc kubenswrapper[4922]: I0929 10:09:43.451993 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:09:43 crc kubenswrapper[4922]: E0929 10:09:43.453465 4922 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:09:50 crc kubenswrapper[4922]: I0929 10:09:50.066459 4922 generic.go:334] "Generic (PLEG): container finished" podID="9c5d1232-a030-44f4-823e-5c806d5dd896" containerID="ced166df789a1ea6baac971de6786a1f0ba66c081ff0c9ba3cb90a742addebfc" exitCode=0 Sep 29 10:09:50 crc kubenswrapper[4922]: I0929 10:09:50.066535 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" event={"ID":"9c5d1232-a030-44f4-823e-5c806d5dd896","Type":"ContainerDied","Data":"ced166df789a1ea6baac971de6786a1f0ba66c081ff0c9ba3cb90a742addebfc"} Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.623388 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.748848 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-inventory\") pod \"9c5d1232-a030-44f4-823e-5c806d5dd896\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.748997 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-ssh-key\") pod \"9c5d1232-a030-44f4-823e-5c806d5dd896\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.749043 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-bootstrap-combined-ca-bundle\") pod \"9c5d1232-a030-44f4-823e-5c806d5dd896\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.749131 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsgd6\" (UniqueName: \"kubernetes.io/projected/9c5d1232-a030-44f4-823e-5c806d5dd896-kube-api-access-qsgd6\") pod \"9c5d1232-a030-44f4-823e-5c806d5dd896\" (UID: \"9c5d1232-a030-44f4-823e-5c806d5dd896\") " Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.757502 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9c5d1232-a030-44f4-823e-5c806d5dd896" (UID: "9c5d1232-a030-44f4-823e-5c806d5dd896"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.759173 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5d1232-a030-44f4-823e-5c806d5dd896-kube-api-access-qsgd6" (OuterVolumeSpecName: "kube-api-access-qsgd6") pod "9c5d1232-a030-44f4-823e-5c806d5dd896" (UID: "9c5d1232-a030-44f4-823e-5c806d5dd896"). InnerVolumeSpecName "kube-api-access-qsgd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.784645 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c5d1232-a030-44f4-823e-5c806d5dd896" (UID: "9c5d1232-a030-44f4-823e-5c806d5dd896"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.787625 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-inventory" (OuterVolumeSpecName: "inventory") pod "9c5d1232-a030-44f4-823e-5c806d5dd896" (UID: "9c5d1232-a030-44f4-823e-5c806d5dd896"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.852192 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.852249 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.852266 4922 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5d1232-a030-44f4-823e-5c806d5dd896-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:51 crc kubenswrapper[4922]: I0929 10:09:51.852284 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsgd6\" (UniqueName: \"kubernetes.io/projected/9c5d1232-a030-44f4-823e-5c806d5dd896-kube-api-access-qsgd6\") on node \"crc\" DevicePath \"\"" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.089554 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" event={"ID":"9c5d1232-a030-44f4-823e-5c806d5dd896","Type":"ContainerDied","Data":"93824d24e5bbf4ef03c26f23372143b38e9915b9a209b80c15338b6338daac67"} Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.090056 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93824d24e5bbf4ef03c26f23372143b38e9915b9a209b80c15338b6338daac67" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.089716 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.203688 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5"] Sep 29 10:09:52 crc kubenswrapper[4922]: E0929 10:09:52.204171 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerName="extract-content" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204191 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerName="extract-content" Sep 29 10:09:52 crc kubenswrapper[4922]: E0929 10:09:52.204203 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerName="registry-server" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204211 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerName="registry-server" Sep 29 10:09:52 crc kubenswrapper[4922]: E0929 10:09:52.204228 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5d1232-a030-44f4-823e-5c806d5dd896" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204237 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5d1232-a030-44f4-823e-5c806d5dd896" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 29 10:09:52 crc kubenswrapper[4922]: E0929 10:09:52.204270 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerName="extract-utilities" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204277 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerName="extract-utilities" Sep 29 10:09:52 crc kubenswrapper[4922]: E0929 10:09:52.204288 
4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerName="registry-server" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204295 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerName="registry-server" Sep 29 10:09:52 crc kubenswrapper[4922]: E0929 10:09:52.204304 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerName="extract-content" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204309 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerName="extract-content" Sep 29 10:09:52 crc kubenswrapper[4922]: E0929 10:09:52.204332 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerName="extract-utilities" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204338 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerName="extract-utilities" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204529 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2f151a-9f8e-439c-a29d-2ffd0d26cc39" containerName="registry-server" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204547 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5d1232-a030-44f4-823e-5c806d5dd896" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.204565 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="73524012-4940-434f-8e49-c1ff37cf8fc2" containerName="registry-server" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.205276 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.208340 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.209356 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.209747 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.209985 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.224454 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5"] Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.363681 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gngd5\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.363777 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gngd5\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.363939 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8q85\" (UniqueName: \"kubernetes.io/projected/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-kube-api-access-x8q85\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gngd5\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.466009 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8q85\" (UniqueName: \"kubernetes.io/projected/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-kube-api-access-x8q85\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gngd5\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.466169 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gngd5\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.466237 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gngd5\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.471738 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-gngd5\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.477380 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gngd5\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.489500 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8q85\" (UniqueName: \"kubernetes.io/projected/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-kube-api-access-x8q85\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gngd5\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:52 crc kubenswrapper[4922]: I0929 10:09:52.526805 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:09:53 crc kubenswrapper[4922]: I0929 10:09:53.062230 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5"] Sep 29 10:09:53 crc kubenswrapper[4922]: I0929 10:09:53.105203 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" event={"ID":"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8","Type":"ContainerStarted","Data":"d4f30e356bfa09c59b2e452a84a944787c524d1835885d9fc8660726f9a2514b"} Sep 29 10:09:54 crc kubenswrapper[4922]: I0929 10:09:54.117857 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" event={"ID":"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8","Type":"ContainerStarted","Data":"aebe3a7db7202230d1b6595dd30f9d65199c2efd01a34370738c1a4243da72cd"} Sep 29 10:09:55 crc kubenswrapper[4922]: I0929 10:09:55.155272 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" podStartSLOduration=2.373874782 podStartE2EDuration="3.15524584s" podCreationTimestamp="2025-09-29 10:09:52 +0000 UTC" firstStartedPulling="2025-09-29 10:09:53.071631084 +0000 UTC m=+1518.437861348" lastFinishedPulling="2025-09-29 10:09:53.853002142 +0000 UTC m=+1519.219232406" observedRunningTime="2025-09-29 10:09:55.144386909 +0000 UTC m=+1520.510617173" watchObservedRunningTime="2025-09-29 10:09:55.15524584 +0000 UTC m=+1520.521476104" Sep 29 10:09:58 crc kubenswrapper[4922]: I0929 10:09:58.453880 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:09:58 crc kubenswrapper[4922]: E0929 10:09:58.456381 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:10:12 crc kubenswrapper[4922]: I0929 10:10:12.454149 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:10:12 crc kubenswrapper[4922]: E0929 10:10:12.455408 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:10:13 crc kubenswrapper[4922]: I0929 10:10:13.055396 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-gjlts"] Sep 29 10:10:13 crc kubenswrapper[4922]: I0929 10:10:13.065574 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lsq42"] Sep 29 10:10:13 crc kubenswrapper[4922]: I0929 10:10:13.084882 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-gjlts"] Sep 29 10:10:13 crc kubenswrapper[4922]: I0929 10:10:13.093727 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lsq42"] Sep 29 10:10:13 crc kubenswrapper[4922]: I0929 10:10:13.470071 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18063e86-56aa-470c-a41d-d7965d242a20" path="/var/lib/kubelet/pods/18063e86-56aa-470c-a41d-d7965d242a20/volumes" Sep 29 10:10:13 crc kubenswrapper[4922]: I0929 10:10:13.470873 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ff184c2c-608a-4711-a2fe-5c8ffe2d64ac" path="/var/lib/kubelet/pods/ff184c2c-608a-4711-a2fe-5c8ffe2d64ac/volumes" Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.045788 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-k5qmh"] Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.056898 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-980b-account-create-z5h6z"] Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.068897 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b8ce-account-create-5lq5r"] Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.079041 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-980b-account-create-z5h6z"] Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.087678 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b8ce-account-create-5lq5r"] Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.098145 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-k5qmh"] Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.461187 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:10:25 crc kubenswrapper[4922]: E0929 10:10:25.461640 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.464399 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62832609-e9cd-41a5-a77e-4fdbe35cd12e" 
path="/var/lib/kubelet/pods/62832609-e9cd-41a5-a77e-4fdbe35cd12e/volumes" Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.465172 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f79e4ba-2854-4d57-9d76-391f020c62ce" path="/var/lib/kubelet/pods/8f79e4ba-2854-4d57-9d76-391f020c62ce/volumes" Sep 29 10:10:25 crc kubenswrapper[4922]: I0929 10:10:25.465780 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b58c075-8b99-4903-8e23-973f208b4edd" path="/var/lib/kubelet/pods/9b58c075-8b99-4903-8e23-973f208b4edd/volumes" Sep 29 10:10:39 crc kubenswrapper[4922]: I0929 10:10:39.453014 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:10:39 crc kubenswrapper[4922]: E0929 10:10:39.454065 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:10:46 crc kubenswrapper[4922]: I0929 10:10:46.038695 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c110-account-create-mxfv8"] Sep 29 10:10:46 crc kubenswrapper[4922]: I0929 10:10:46.050561 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c110-account-create-mxfv8"] Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.033935 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mcgfr"] Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.043873 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qxvjk"] Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.054927 4922 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/cinder-db-create-mcgfr"] Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.064578 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qxvjk"] Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.073589 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-gmsv6"] Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.084188 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-gmsv6"] Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.472390 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e9cd745-1b41-41dd-96a7-7e67ef51684d" path="/var/lib/kubelet/pods/6e9cd745-1b41-41dd-96a7-7e67ef51684d/volumes" Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.474224 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e007462-51b9-4640-94b1-019c85704aed" path="/var/lib/kubelet/pods/7e007462-51b9-4640-94b1-019c85704aed/volumes" Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.475098 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b8c7f07-5d37-4501-9c88-32cff699802f" path="/var/lib/kubelet/pods/9b8c7f07-5d37-4501-9c88-32cff699802f/volumes" Sep 29 10:10:47 crc kubenswrapper[4922]: I0929 10:10:47.475809 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1420681-6f1a-40a2-8176-32fcff81af93" path="/var/lib/kubelet/pods/e1420681-6f1a-40a2-8176-32fcff81af93/volumes" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.196668 4922 scope.go:117] "RemoveContainer" containerID="cf366e5f7a000734a4eac4b1ed2cef2d2e1895363ff318bd4179a2051c5ef630" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.224293 4922 scope.go:117] "RemoveContainer" containerID="7b845b20c851b4631ce7f0d840910c73eab887da9fb50ef8a3ae28bf683bb115" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.250795 4922 scope.go:117] "RemoveContainer" 
containerID="00a5e1c9248f76ae152bbd86fedb3abea340b22063905e555de49e63678f20ae" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.309691 4922 scope.go:117] "RemoveContainer" containerID="7aac3f3cfcfccddafbdf693611a54a01f3b11a7739c4cebab1f3a5bded90f11d" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.341597 4922 scope.go:117] "RemoveContainer" containerID="a4b9d9eed4bdc55a21030da4539f5f95a670836403ee1647feec04ae0bfb5980" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.390777 4922 scope.go:117] "RemoveContainer" containerID="9b96d8027206a5ca9d0c46914b95f16ada9699ceb7770928b1a075ecf4c76af6" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.436923 4922 scope.go:117] "RemoveContainer" containerID="478fac433f081b6cf14a95627c1dc9eadcf24b141f8c90e7d09ca36e07bdd72c" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.462174 4922 scope.go:117] "RemoveContainer" containerID="0c0a5d3986f52bd4a0cfe40389c3bcd945037a8d8ce88d86ebe1b0612b7d8d8d" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.524733 4922 scope.go:117] "RemoveContainer" containerID="b612715a436f5665726ee3a58544af6b5dc218ef8f48ad388d5331f9138ef0ad" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.555939 4922 scope.go:117] "RemoveContainer" containerID="f94cf50a1f1bd578e71cb01ddce02d66dfb7793da43fa42ee39478637362a8a0" Sep 29 10:10:49 crc kubenswrapper[4922]: I0929 10:10:49.579456 4922 scope.go:117] "RemoveContainer" containerID="83ac3deff58fb0730474627dd5e1c2225bc05ba71b520fa7b4a6d372f7bbca9d" Sep 29 10:10:52 crc kubenswrapper[4922]: I0929 10:10:52.452004 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:10:52 crc kubenswrapper[4922]: E0929 10:10:52.452704 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:10:55 crc kubenswrapper[4922]: I0929 10:10:55.051784 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-57wm2"] Sep 29 10:10:55 crc kubenswrapper[4922]: I0929 10:10:55.059002 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-57wm2"] Sep 29 10:10:55 crc kubenswrapper[4922]: I0929 10:10:55.471536 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd503cc-5b91-4ee8-b354-ada3ba37812a" path="/var/lib/kubelet/pods/8fd503cc-5b91-4ee8-b354-ada3ba37812a/volumes" Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.049645 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ccc8-account-create-ss7k4"] Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.058275 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-902c-account-create-d724p"] Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.068729 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ccc8-account-create-ss7k4"] Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.080460 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-55rl7"] Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.093972 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-902c-account-create-d724p"] Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.107375 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-55rl7"] Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.119998 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e940-account-create-6pnx5"] Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 
10:11:01.129099 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e940-account-create-6pnx5"] Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.470280 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51967537-baef-4e1f-a056-fc90648a3193" path="/var/lib/kubelet/pods/51967537-baef-4e1f-a056-fc90648a3193/volumes" Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.471042 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6abd65ed-7467-4e06-91a3-8190c697c779" path="/var/lib/kubelet/pods/6abd65ed-7467-4e06-91a3-8190c697c779/volumes" Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.471817 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5832265-b311-40bd-acbb-578a7c35814f" path="/var/lib/kubelet/pods/a5832265-b311-40bd-acbb-578a7c35814f/volumes" Sep 29 10:11:01 crc kubenswrapper[4922]: I0929 10:11:01.472517 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d471b234-b711-4771-a1c7-818c56789a93" path="/var/lib/kubelet/pods/d471b234-b711-4771-a1c7-818c56789a93/volumes" Sep 29 10:11:05 crc kubenswrapper[4922]: I0929 10:11:05.463849 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:11:05 crc kubenswrapper[4922]: E0929 10:11:05.464749 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:11:18 crc kubenswrapper[4922]: I0929 10:11:18.452869 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:11:18 crc 
kubenswrapper[4922]: E0929 10:11:18.454394 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:11:31 crc kubenswrapper[4922]: I0929 10:11:31.455117 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:11:31 crc kubenswrapper[4922]: E0929 10:11:31.455998 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:11:38 crc kubenswrapper[4922]: I0929 10:11:38.276511 4922 generic.go:334] "Generic (PLEG): container finished" podID="3cbc70f7-2707-430a-a8d1-d33aee8c7ae8" containerID="aebe3a7db7202230d1b6595dd30f9d65199c2efd01a34370738c1a4243da72cd" exitCode=0 Sep 29 10:11:38 crc kubenswrapper[4922]: I0929 10:11:38.276602 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" event={"ID":"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8","Type":"ContainerDied","Data":"aebe3a7db7202230d1b6595dd30f9d65199c2efd01a34370738c1a4243da72cd"} Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.051676 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-nxhsm"] Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.064176 4922 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/neutron-db-sync-nxhsm"] Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.468438 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ae426f-29ac-46dd-a865-41b4c4a0e722" path="/var/lib/kubelet/pods/27ae426f-29ac-46dd-a865-41b4c4a0e722/volumes" Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.717968 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.824020 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-ssh-key\") pod \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.824222 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8q85\" (UniqueName: \"kubernetes.io/projected/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-kube-api-access-x8q85\") pod \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.824327 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-inventory\") pod \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\" (UID: \"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8\") " Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.846898 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-kube-api-access-x8q85" (OuterVolumeSpecName: "kube-api-access-x8q85") pod "3cbc70f7-2707-430a-a8d1-d33aee8c7ae8" (UID: "3cbc70f7-2707-430a-a8d1-d33aee8c7ae8"). InnerVolumeSpecName "kube-api-access-x8q85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.859147 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-inventory" (OuterVolumeSpecName: "inventory") pod "3cbc70f7-2707-430a-a8d1-d33aee8c7ae8" (UID: "3cbc70f7-2707-430a-a8d1-d33aee8c7ae8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.867962 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3cbc70f7-2707-430a-a8d1-d33aee8c7ae8" (UID: "3cbc70f7-2707-430a-a8d1-d33aee8c7ae8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.928288 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8q85\" (UniqueName: \"kubernetes.io/projected/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-kube-api-access-x8q85\") on node \"crc\" DevicePath \"\"" Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.928343 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:11:39 crc kubenswrapper[4922]: I0929 10:11:39.928357 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbc70f7-2707-430a-a8d1-d33aee8c7ae8-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.301188 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" 
event={"ID":"3cbc70f7-2707-430a-a8d1-d33aee8c7ae8","Type":"ContainerDied","Data":"d4f30e356bfa09c59b2e452a84a944787c524d1835885d9fc8660726f9a2514b"} Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.301657 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f30e356bfa09c59b2e452a84a944787c524d1835885d9fc8660726f9a2514b" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.301263 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gngd5" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.390933 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9"] Sep 29 10:11:40 crc kubenswrapper[4922]: E0929 10:11:40.391474 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cbc70f7-2707-430a-a8d1-d33aee8c7ae8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.391495 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cbc70f7-2707-430a-a8d1-d33aee8c7ae8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.391704 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cbc70f7-2707-430a-a8d1-d33aee8c7ae8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.392457 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.407779 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.408523 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.408735 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.408956 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.416343 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9"] Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.436112 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.436215 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp8lf\" (UniqueName: \"kubernetes.io/projected/b7891137-eddd-4865-9a35-f32a72a1f206-kube-api-access-kp8lf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 
10:11:40.436300 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.538198 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.538279 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp8lf\" (UniqueName: \"kubernetes.io/projected/b7891137-eddd-4865-9a35-f32a72a1f206-kube-api-access-kp8lf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.538325 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.552696 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.553447 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.558625 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp8lf\" (UniqueName: \"kubernetes.io/projected/b7891137-eddd-4865-9a35-f32a72a1f206-kube-api-access-kp8lf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:40 crc kubenswrapper[4922]: I0929 10:11:40.749683 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:11:41 crc kubenswrapper[4922]: I0929 10:11:41.145288 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9"] Sep 29 10:11:41 crc kubenswrapper[4922]: I0929 10:11:41.158500 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:11:41 crc kubenswrapper[4922]: I0929 10:11:41.315469 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" event={"ID":"b7891137-eddd-4865-9a35-f32a72a1f206","Type":"ContainerStarted","Data":"666bc69995e5d874b9d47801c265dcbad00ce94fc93e332bda58688367a06537"} Sep 29 10:11:42 crc kubenswrapper[4922]: I0929 10:11:42.328280 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" event={"ID":"b7891137-eddd-4865-9a35-f32a72a1f206","Type":"ContainerStarted","Data":"397f4687debd2dbfd05aeb69cac2166ee07cc15314893326d5601b225bdde70e"} Sep 29 10:11:42 crc kubenswrapper[4922]: I0929 10:11:42.357285 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" podStartSLOduration=1.758974687 podStartE2EDuration="2.357250308s" podCreationTimestamp="2025-09-29 10:11:40 +0000 UTC" firstStartedPulling="2025-09-29 10:11:41.15818599 +0000 UTC m=+1626.524416264" lastFinishedPulling="2025-09-29 10:11:41.756461621 +0000 UTC m=+1627.122691885" observedRunningTime="2025-09-29 10:11:42.347294702 +0000 UTC m=+1627.713524976" watchObservedRunningTime="2025-09-29 10:11:42.357250308 +0000 UTC m=+1627.723480592" Sep 29 10:11:43 crc kubenswrapper[4922]: I0929 10:11:43.452979 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 
10:11:43 crc kubenswrapper[4922]: E0929 10:11:43.453702 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:11:48 crc kubenswrapper[4922]: I0929 10:11:48.045840 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jv255"] Sep 29 10:11:48 crc kubenswrapper[4922]: I0929 10:11:48.057666 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jv255"] Sep 29 10:11:48 crc kubenswrapper[4922]: I0929 10:11:48.067590 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x7rtg"] Sep 29 10:11:48 crc kubenswrapper[4922]: I0929 10:11:48.077923 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x7rtg"] Sep 29 10:11:49 crc kubenswrapper[4922]: I0929 10:11:49.478247 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b8fc131-e5f1-45f2-9c6c-1050ca368d5a" path="/var/lib/kubelet/pods/1b8fc131-e5f1-45f2-9c6c-1050ca368d5a/volumes" Sep 29 10:11:49 crc kubenswrapper[4922]: I0929 10:11:49.479721 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4b100f6-2a77-43b8-8942-0e50151142d0" path="/var/lib/kubelet/pods/e4b100f6-2a77-43b8-8942-0e50151142d0/volumes" Sep 29 10:11:49 crc kubenswrapper[4922]: I0929 10:11:49.836278 4922 scope.go:117] "RemoveContainer" containerID="3f34acd702c80793661ad84371ea0d7601da79129b46adb57966367a6bf794da" Sep 29 10:11:49 crc kubenswrapper[4922]: I0929 10:11:49.864088 4922 scope.go:117] "RemoveContainer" containerID="387201c090373052742813abe5c2c5cff6e3729873fba0d13266f1c498a49319" Sep 29 10:11:49 
crc kubenswrapper[4922]: I0929 10:11:49.949947 4922 scope.go:117] "RemoveContainer" containerID="d7f13a60bfe17a8e6b865b9e72792711035c6a568ca0bf6357b24e2b42bf6ae5" Sep 29 10:11:50 crc kubenswrapper[4922]: I0929 10:11:50.003637 4922 scope.go:117] "RemoveContainer" containerID="eb648f4baca8d0fce51976e46dba225624eb8e66bb4a457c728ece1cb525cd61" Sep 29 10:11:50 crc kubenswrapper[4922]: I0929 10:11:50.038176 4922 scope.go:117] "RemoveContainer" containerID="a53e2d45a2a7911f9babbdf279dad97f43b638caf0657eccca37a16a3458a8be" Sep 29 10:11:50 crc kubenswrapper[4922]: I0929 10:11:50.074251 4922 scope.go:117] "RemoveContainer" containerID="0bea57f17394c642e23109b448bdbb1ffe261d1c274d5716f320f14a775b8168" Sep 29 10:11:50 crc kubenswrapper[4922]: I0929 10:11:50.173343 4922 scope.go:117] "RemoveContainer" containerID="1401802f4c7bf771ed6fc6c0336e24190184d7386012ee75c71b6f7682ff4d76" Sep 29 10:11:50 crc kubenswrapper[4922]: I0929 10:11:50.213598 4922 scope.go:117] "RemoveContainer" containerID="6e9d93b9fd6a0f4de8cd71ea385314346b5733cb664ab08be96cfdeb215db6a7" Sep 29 10:11:55 crc kubenswrapper[4922]: I0929 10:11:55.468162 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:11:55 crc kubenswrapper[4922]: E0929 10:11:55.469622 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:11:58 crc kubenswrapper[4922]: I0929 10:11:58.066597 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-dtwl5"] Sep 29 10:11:58 crc kubenswrapper[4922]: I0929 10:11:58.081664 4922 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/barbican-db-sync-dtwl5"] Sep 29 10:11:59 crc kubenswrapper[4922]: I0929 10:11:59.465074 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed54c52a-229b-45f0-8526-19d6ca42237c" path="/var/lib/kubelet/pods/ed54c52a-229b-45f0-8526-19d6ca42237c/volumes" Sep 29 10:12:00 crc kubenswrapper[4922]: I0929 10:12:00.043162 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-7lxww"] Sep 29 10:12:00 crc kubenswrapper[4922]: I0929 10:12:00.053980 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-7lxww"] Sep 29 10:12:01 crc kubenswrapper[4922]: I0929 10:12:01.466699 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2d9bba-864b-468d-923e-23cf0544daf9" path="/var/lib/kubelet/pods/0c2d9bba-864b-468d-923e-23cf0544daf9/volumes" Sep 29 10:12:06 crc kubenswrapper[4922]: I0929 10:12:06.451827 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:12:06 crc kubenswrapper[4922]: E0929 10:12:06.453469 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:12:21 crc kubenswrapper[4922]: I0929 10:12:21.453106 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:12:21 crc kubenswrapper[4922]: E0929 10:12:21.454420 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:12:33 crc kubenswrapper[4922]: I0929 10:12:33.451989 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:12:33 crc kubenswrapper[4922]: E0929 10:12:33.452944 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:12:43 crc kubenswrapper[4922]: I0929 10:12:43.055013 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-dhmn8"] Sep 29 10:12:43 crc kubenswrapper[4922]: I0929 10:12:43.064500 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-dhmn8"] Sep 29 10:12:43 crc kubenswrapper[4922]: I0929 10:12:43.075731 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-bdl79"] Sep 29 10:12:43 crc kubenswrapper[4922]: I0929 10:12:43.086386 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-bdl79"] Sep 29 10:12:43 crc kubenswrapper[4922]: I0929 10:12:43.096681 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dc5px"] Sep 29 10:12:43 crc kubenswrapper[4922]: I0929 10:12:43.106890 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dc5px"] Sep 29 10:12:43 crc kubenswrapper[4922]: I0929 10:12:43.466103 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="179180d9-2c92-4e80-bf4c-560bfe6e3a69" path="/var/lib/kubelet/pods/179180d9-2c92-4e80-bf4c-560bfe6e3a69/volumes" Sep 29 10:12:43 crc kubenswrapper[4922]: I0929 10:12:43.466652 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3536ac4e-7447-4439-aa09-ef3fce28f84a" path="/var/lib/kubelet/pods/3536ac4e-7447-4439-aa09-ef3fce28f84a/volumes" Sep 29 10:12:43 crc kubenswrapper[4922]: I0929 10:12:43.467176 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bac65ee5-a195-4375-a997-c0f5cfea448e" path="/var/lib/kubelet/pods/bac65ee5-a195-4375-a997-c0f5cfea448e/volumes" Sep 29 10:12:44 crc kubenswrapper[4922]: I0929 10:12:44.452049 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:12:44 crc kubenswrapper[4922]: E0929 10:12:44.452851 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:12:50 crc kubenswrapper[4922]: I0929 10:12:50.406105 4922 scope.go:117] "RemoveContainer" containerID="eef47573cc31b3b5a4cc7b1598a121fdb0fc95cdc2c8737b7e2da4edbb3fd519" Sep 29 10:12:50 crc kubenswrapper[4922]: I0929 10:12:50.464701 4922 scope.go:117] "RemoveContainer" containerID="d7282c01419a74f27d0b9accaf14900951a579f6d8a96ffec4afce1fb373401d" Sep 29 10:12:50 crc kubenswrapper[4922]: I0929 10:12:50.491181 4922 scope.go:117] "RemoveContainer" containerID="2ad9df1339618630e3f336a05c1b9feb2487875c208c5ac27898a98a97112b63" Sep 29 10:12:50 crc kubenswrapper[4922]: I0929 10:12:50.540544 4922 scope.go:117] "RemoveContainer" 
containerID="3fd6d839f8fafd5749397a685dc26ea36c366c517f58c90964d888ca0f8de4ca" Sep 29 10:12:50 crc kubenswrapper[4922]: I0929 10:12:50.601274 4922 scope.go:117] "RemoveContainer" containerID="5bfae74081334f822c7239f7a7142602cfb9475747994858e5fafbd23645e2c4" Sep 29 10:12:53 crc kubenswrapper[4922]: I0929 10:12:53.050797 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2ee9-account-create-g6ttg"] Sep 29 10:12:53 crc kubenswrapper[4922]: I0929 10:12:53.062110 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b33d-account-create-7wvb5"] Sep 29 10:12:53 crc kubenswrapper[4922]: I0929 10:12:53.076767 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0045-account-create-p2pzv"] Sep 29 10:12:53 crc kubenswrapper[4922]: I0929 10:12:53.085645 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b33d-account-create-7wvb5"] Sep 29 10:12:53 crc kubenswrapper[4922]: I0929 10:12:53.094191 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2ee9-account-create-g6ttg"] Sep 29 10:12:53 crc kubenswrapper[4922]: I0929 10:12:53.103143 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0045-account-create-p2pzv"] Sep 29 10:12:53 crc kubenswrapper[4922]: I0929 10:12:53.462257 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146980d0-cc5b-44ad-87c7-fd6463f25659" path="/var/lib/kubelet/pods/146980d0-cc5b-44ad-87c7-fd6463f25659/volumes" Sep 29 10:12:53 crc kubenswrapper[4922]: I0929 10:12:53.463199 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77c731a0-29c8-476c-98a0-3bf96579183f" path="/var/lib/kubelet/pods/77c731a0-29c8-476c-98a0-3bf96579183f/volumes" Sep 29 10:12:53 crc kubenswrapper[4922]: I0929 10:12:53.463788 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e9ee79-2a19-4e08-9e57-a1d745c7976e" 
path="/var/lib/kubelet/pods/d8e9ee79-2a19-4e08-9e57-a1d745c7976e/volumes" Sep 29 10:12:56 crc kubenswrapper[4922]: I0929 10:12:56.453122 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:12:56 crc kubenswrapper[4922]: E0929 10:12:56.454091 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:12:57 crc kubenswrapper[4922]: I0929 10:12:57.095383 4922 generic.go:334] "Generic (PLEG): container finished" podID="b7891137-eddd-4865-9a35-f32a72a1f206" containerID="397f4687debd2dbfd05aeb69cac2166ee07cc15314893326d5601b225bdde70e" exitCode=0 Sep 29 10:12:57 crc kubenswrapper[4922]: I0929 10:12:57.095747 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" event={"ID":"b7891137-eddd-4865-9a35-f32a72a1f206","Type":"ContainerDied","Data":"397f4687debd2dbfd05aeb69cac2166ee07cc15314893326d5601b225bdde70e"} Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.564221 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.663775 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp8lf\" (UniqueName: \"kubernetes.io/projected/b7891137-eddd-4865-9a35-f32a72a1f206-kube-api-access-kp8lf\") pod \"b7891137-eddd-4865-9a35-f32a72a1f206\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.663908 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-inventory\") pod \"b7891137-eddd-4865-9a35-f32a72a1f206\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.664009 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-ssh-key\") pod \"b7891137-eddd-4865-9a35-f32a72a1f206\" (UID: \"b7891137-eddd-4865-9a35-f32a72a1f206\") " Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.671235 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7891137-eddd-4865-9a35-f32a72a1f206-kube-api-access-kp8lf" (OuterVolumeSpecName: "kube-api-access-kp8lf") pod "b7891137-eddd-4865-9a35-f32a72a1f206" (UID: "b7891137-eddd-4865-9a35-f32a72a1f206"). InnerVolumeSpecName "kube-api-access-kp8lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.699177 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-inventory" (OuterVolumeSpecName: "inventory") pod "b7891137-eddd-4865-9a35-f32a72a1f206" (UID: "b7891137-eddd-4865-9a35-f32a72a1f206"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.700152 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b7891137-eddd-4865-9a35-f32a72a1f206" (UID: "b7891137-eddd-4865-9a35-f32a72a1f206"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.766721 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.766772 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp8lf\" (UniqueName: \"kubernetes.io/projected/b7891137-eddd-4865-9a35-f32a72a1f206-kube-api-access-kp8lf\") on node \"crc\" DevicePath \"\"" Sep 29 10:12:58 crc kubenswrapper[4922]: I0929 10:12:58.766785 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7891137-eddd-4865-9a35-f32a72a1f206-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.119391 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" event={"ID":"b7891137-eddd-4865-9a35-f32a72a1f206","Type":"ContainerDied","Data":"666bc69995e5d874b9d47801c265dcbad00ce94fc93e332bda58688367a06537"} Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.119449 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="666bc69995e5d874b9d47801c265dcbad00ce94fc93e332bda58688367a06537" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.119490 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.228398 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn"] Sep 29 10:12:59 crc kubenswrapper[4922]: E0929 10:12:59.229032 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7891137-eddd-4865-9a35-f32a72a1f206" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.229056 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7891137-eddd-4865-9a35-f32a72a1f206" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.229265 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7891137-eddd-4865-9a35-f32a72a1f206" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.230122 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.236663 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.236724 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.237627 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.237950 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.243082 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn"] Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.277603 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.277981 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.278310 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpz5m\" (UniqueName: \"kubernetes.io/projected/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-kube-api-access-lpz5m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.380211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpz5m\" (UniqueName: \"kubernetes.io/projected/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-kube-api-access-lpz5m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.380348 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.380422 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.386215 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.387745 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.400127 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpz5m\" (UniqueName: \"kubernetes.io/projected/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-kube-api-access-lpz5m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:12:59 crc kubenswrapper[4922]: I0929 10:12:59.558998 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:13:00 crc kubenswrapper[4922]: I0929 10:13:00.128272 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn"] Sep 29 10:13:01 crc kubenswrapper[4922]: I0929 10:13:01.142395 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" event={"ID":"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f","Type":"ContainerStarted","Data":"c45e2024a9e8b652dfba12cbf98f170190c17360a25565083640fa98fcac9e56"} Sep 29 10:13:01 crc kubenswrapper[4922]: I0929 10:13:01.143437 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" event={"ID":"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f","Type":"ContainerStarted","Data":"78d9e20c7ac325e2a4021ff83b60935125593a93cd14afccd61868d7534cc72b"} Sep 29 10:13:06 crc kubenswrapper[4922]: I0929 10:13:06.198772 4922 generic.go:334] "Generic (PLEG): container finished" podID="6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f" containerID="c45e2024a9e8b652dfba12cbf98f170190c17360a25565083640fa98fcac9e56" exitCode=0 Sep 29 10:13:06 crc kubenswrapper[4922]: I0929 10:13:06.198884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" event={"ID":"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f","Type":"ContainerDied","Data":"c45e2024a9e8b652dfba12cbf98f170190c17360a25565083640fa98fcac9e56"} Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.627953 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.667286 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpz5m\" (UniqueName: \"kubernetes.io/projected/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-kube-api-access-lpz5m\") pod \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.667531 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-ssh-key\") pod \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.667712 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-inventory\") pod \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\" (UID: \"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f\") " Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.677878 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-kube-api-access-lpz5m" (OuterVolumeSpecName: "kube-api-access-lpz5m") pod "6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f" (UID: "6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f"). InnerVolumeSpecName "kube-api-access-lpz5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.705387 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-inventory" (OuterVolumeSpecName: "inventory") pod "6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f" (UID: "6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.706003 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f" (UID: "6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.769800 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.769849 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:13:07 crc kubenswrapper[4922]: I0929 10:13:07.769861 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpz5m\" (UniqueName: \"kubernetes.io/projected/6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f-kube-api-access-lpz5m\") on node \"crc\" DevicePath \"\"" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.223928 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" event={"ID":"6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f","Type":"ContainerDied","Data":"78d9e20c7ac325e2a4021ff83b60935125593a93cd14afccd61868d7534cc72b"} Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.224389 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78d9e20c7ac325e2a4021ff83b60935125593a93cd14afccd61868d7534cc72b" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.224009 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.315943 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn"] Sep 29 10:13:08 crc kubenswrapper[4922]: E0929 10:13:08.316516 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.316537 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.316772 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.317677 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.320291 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.320598 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.320742 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.320956 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.341008 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn"] Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.382505 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrnf\" (UniqueName: \"kubernetes.io/projected/425016bd-6178-497e-ad2b-e150d1cf141f-kube-api-access-fxrnf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7wdvn\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.382618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7wdvn\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.383322 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7wdvn\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.485466 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrnf\" (UniqueName: \"kubernetes.io/projected/425016bd-6178-497e-ad2b-e150d1cf141f-kube-api-access-fxrnf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7wdvn\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.485522 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7wdvn\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.486766 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7wdvn\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.494757 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7wdvn\" (UID: 
\"425016bd-6178-497e-ad2b-e150d1cf141f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.495295 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7wdvn\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.515570 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrnf\" (UniqueName: \"kubernetes.io/projected/425016bd-6178-497e-ad2b-e150d1cf141f-kube-api-access-fxrnf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7wdvn\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:08 crc kubenswrapper[4922]: I0929 10:13:08.643671 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:09 crc kubenswrapper[4922]: I0929 10:13:09.233768 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn"] Sep 29 10:13:10 crc kubenswrapper[4922]: I0929 10:13:10.246704 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" event={"ID":"425016bd-6178-497e-ad2b-e150d1cf141f","Type":"ContainerStarted","Data":"5884d7f7833f0a757e1167cc60069603241c395ffed6bfbf9cd3ef6ac9fc3eef"} Sep 29 10:13:10 crc kubenswrapper[4922]: I0929 10:13:10.247339 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" event={"ID":"425016bd-6178-497e-ad2b-e150d1cf141f","Type":"ContainerStarted","Data":"484976cbfac9f856f8acaf0776006a9875ea819971eaba279e78d919fb372792"} Sep 29 10:13:10 crc kubenswrapper[4922]: I0929 10:13:10.270927 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" podStartSLOduration=1.849730629 podStartE2EDuration="2.270895232s" podCreationTimestamp="2025-09-29 10:13:08 +0000 UTC" firstStartedPulling="2025-09-29 10:13:09.246322311 +0000 UTC m=+1714.612552595" lastFinishedPulling="2025-09-29 10:13:09.667486924 +0000 UTC m=+1715.033717198" observedRunningTime="2025-09-29 10:13:10.269416873 +0000 UTC m=+1715.635647137" watchObservedRunningTime="2025-09-29 10:13:10.270895232 +0000 UTC m=+1715.637125506" Sep 29 10:13:10 crc kubenswrapper[4922]: I0929 10:13:10.452688 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:13:10 crc kubenswrapper[4922]: E0929 10:13:10.453117 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:13:22 crc kubenswrapper[4922]: I0929 10:13:22.452985 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:13:22 crc kubenswrapper[4922]: E0929 10:13:22.453940 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:13:27 crc kubenswrapper[4922]: I0929 10:13:27.058188 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cfqkk"] Sep 29 10:13:27 crc kubenswrapper[4922]: I0929 10:13:27.068563 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cfqkk"] Sep 29 10:13:27 crc kubenswrapper[4922]: I0929 10:13:27.464036 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a157db-ed95-45a6-9e10-acad67ba9e0f" path="/var/lib/kubelet/pods/e1a157db-ed95-45a6-9e10-acad67ba9e0f/volumes" Sep 29 10:13:35 crc kubenswrapper[4922]: I0929 10:13:35.463175 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:13:35 crc kubenswrapper[4922]: E0929 10:13:35.464935 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:13:46 crc kubenswrapper[4922]: I0929 10:13:46.042126 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-8fjc9"] Sep 29 10:13:46 crc kubenswrapper[4922]: I0929 10:13:46.052399 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-8fjc9"] Sep 29 10:13:46 crc kubenswrapper[4922]: I0929 10:13:46.452549 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:13:46 crc kubenswrapper[4922]: E0929 10:13:46.453102 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:13:47 crc kubenswrapper[4922]: I0929 10:13:47.477070 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93689363-9408-4bc9-b502-0471871ff5ba" path="/var/lib/kubelet/pods/93689363-9408-4bc9-b502-0471871ff5ba/volumes" Sep 29 10:13:47 crc kubenswrapper[4922]: I0929 10:13:47.663978 4922 generic.go:334] "Generic (PLEG): container finished" podID="425016bd-6178-497e-ad2b-e150d1cf141f" containerID="5884d7f7833f0a757e1167cc60069603241c395ffed6bfbf9cd3ef6ac9fc3eef" exitCode=0 Sep 29 10:13:47 crc kubenswrapper[4922]: I0929 10:13:47.664045 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" 
event={"ID":"425016bd-6178-497e-ad2b-e150d1cf141f","Type":"ContainerDied","Data":"5884d7f7833f0a757e1167cc60069603241c395ffed6bfbf9cd3ef6ac9fc3eef"} Sep 29 10:13:48 crc kubenswrapper[4922]: I0929 10:13:48.040552 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85szq"] Sep 29 10:13:48 crc kubenswrapper[4922]: I0929 10:13:48.054188 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-85szq"] Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.152709 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.249957 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-ssh-key\") pod \"425016bd-6178-497e-ad2b-e150d1cf141f\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.250234 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxrnf\" (UniqueName: \"kubernetes.io/projected/425016bd-6178-497e-ad2b-e150d1cf141f-kube-api-access-fxrnf\") pod \"425016bd-6178-497e-ad2b-e150d1cf141f\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.250527 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-inventory\") pod \"425016bd-6178-497e-ad2b-e150d1cf141f\" (UID: \"425016bd-6178-497e-ad2b-e150d1cf141f\") " Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.260769 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425016bd-6178-497e-ad2b-e150d1cf141f-kube-api-access-fxrnf" (OuterVolumeSpecName: 
"kube-api-access-fxrnf") pod "425016bd-6178-497e-ad2b-e150d1cf141f" (UID: "425016bd-6178-497e-ad2b-e150d1cf141f"). InnerVolumeSpecName "kube-api-access-fxrnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.285567 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-inventory" (OuterVolumeSpecName: "inventory") pod "425016bd-6178-497e-ad2b-e150d1cf141f" (UID: "425016bd-6178-497e-ad2b-e150d1cf141f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.287778 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "425016bd-6178-497e-ad2b-e150d1cf141f" (UID: "425016bd-6178-497e-ad2b-e150d1cf141f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.355437 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxrnf\" (UniqueName: \"kubernetes.io/projected/425016bd-6178-497e-ad2b-e150d1cf141f-kube-api-access-fxrnf\") on node \"crc\" DevicePath \"\"" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.355943 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.356032 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/425016bd-6178-497e-ad2b-e150d1cf141f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.465436 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7807d04e-1d92-4727-9cad-6504967c92ad" path="/var/lib/kubelet/pods/7807d04e-1d92-4727-9cad-6504967c92ad/volumes" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.683545 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" event={"ID":"425016bd-6178-497e-ad2b-e150d1cf141f","Type":"ContainerDied","Data":"484976cbfac9f856f8acaf0776006a9875ea819971eaba279e78d919fb372792"} Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.683597 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="484976cbfac9f856f8acaf0776006a9875ea819971eaba279e78d919fb372792" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.683738 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7wdvn" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.796551 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc"] Sep 29 10:13:49 crc kubenswrapper[4922]: E0929 10:13:49.797209 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425016bd-6178-497e-ad2b-e150d1cf141f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.797246 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="425016bd-6178-497e-ad2b-e150d1cf141f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.797584 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="425016bd-6178-497e-ad2b-e150d1cf141f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.798589 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.804685 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.804714 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.806452 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.807218 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.814343 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc"] Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.870860 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.871534 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fs7t\" (UniqueName: \"kubernetes.io/projected/8653f711-4f91-4ce3-a900-95aa54ac26a1-kube-api-access-9fs7t\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.871586 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.973870 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.974002 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fs7t\" (UniqueName: \"kubernetes.io/projected/8653f711-4f91-4ce3-a900-95aa54ac26a1-kube-api-access-9fs7t\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.974040 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.978261 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc\" (UID: 
\"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.985302 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:49 crc kubenswrapper[4922]: I0929 10:13:49.992472 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fs7t\" (UniqueName: \"kubernetes.io/projected/8653f711-4f91-4ce3-a900-95aa54ac26a1-kube-api-access-9fs7t\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:50 crc kubenswrapper[4922]: I0929 10:13:50.119395 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:13:50 crc kubenswrapper[4922]: I0929 10:13:50.686727 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc"] Sep 29 10:13:50 crc kubenswrapper[4922]: I0929 10:13:50.793682 4922 scope.go:117] "RemoveContainer" containerID="0f7f359fbce576e1fdb0dec2904402546251b18c10b6d1b01d139a3a7832b46f" Sep 29 10:13:50 crc kubenswrapper[4922]: I0929 10:13:50.836043 4922 scope.go:117] "RemoveContainer" containerID="1ca57c8afc1ea282b4480c13a992e9caffcc1f5421c18914065e59c8ea52172c" Sep 29 10:13:50 crc kubenswrapper[4922]: I0929 10:13:50.907920 4922 scope.go:117] "RemoveContainer" containerID="bce7dc1844ea1b95e0b5e90c68f9469df1ab8d2094dbe5818fad88cae3d68079" Sep 29 10:13:50 crc kubenswrapper[4922]: I0929 10:13:50.935563 4922 scope.go:117] "RemoveContainer" containerID="80d6d14c1fefb18655c259c0a92b49b4159e49ad90f7ce983642396bea4a7655" Sep 29 10:13:51 crc kubenswrapper[4922]: I0929 10:13:50.998924 4922 scope.go:117] "RemoveContainer" containerID="b7be3d7d3a50aabc809cfb0b5f4b2241e494282d6fefd0ef2e6316ce4e222470" Sep 29 10:13:51 crc kubenswrapper[4922]: I0929 10:13:51.052362 4922 scope.go:117] "RemoveContainer" containerID="8832034619e8459a41e5045254a8e0b2567e755ef13b91f024e859cb9fd0c84e" Sep 29 10:13:51 crc kubenswrapper[4922]: I0929 10:13:51.714297 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" event={"ID":"8653f711-4f91-4ce3-a900-95aa54ac26a1","Type":"ContainerStarted","Data":"0ad8a1e5bfde32713b7a3d30b2b32011c0f5463ece1aa1363a0e0d37e8c18d64"} Sep 29 10:13:51 crc kubenswrapper[4922]: I0929 10:13:51.714807 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" 
event={"ID":"8653f711-4f91-4ce3-a900-95aa54ac26a1","Type":"ContainerStarted","Data":"6ba168076702e1180c7b05541c6650183a36272d0e5dc7cc697077d04612ccb1"} Sep 29 10:13:51 crc kubenswrapper[4922]: I0929 10:13:51.745685 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" podStartSLOduration=2.15857931 podStartE2EDuration="2.745661609s" podCreationTimestamp="2025-09-29 10:13:49 +0000 UTC" firstStartedPulling="2025-09-29 10:13:50.697251832 +0000 UTC m=+1756.063482096" lastFinishedPulling="2025-09-29 10:13:51.284334131 +0000 UTC m=+1756.650564395" observedRunningTime="2025-09-29 10:13:51.735765327 +0000 UTC m=+1757.101995591" watchObservedRunningTime="2025-09-29 10:13:51.745661609 +0000 UTC m=+1757.111891873" Sep 29 10:13:58 crc kubenswrapper[4922]: I0929 10:13:58.452095 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:13:58 crc kubenswrapper[4922]: E0929 10:13:58.453551 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:14:11 crc kubenswrapper[4922]: I0929 10:14:11.453052 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:14:11 crc kubenswrapper[4922]: E0929 10:14:11.454288 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:14:23 crc kubenswrapper[4922]: I0929 10:14:23.451620 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:14:23 crc kubenswrapper[4922]: E0929 10:14:23.452490 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:14:31 crc kubenswrapper[4922]: I0929 10:14:31.099676 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-r6vhs"] Sep 29 10:14:31 crc kubenswrapper[4922]: I0929 10:14:31.116821 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-r6vhs"] Sep 29 10:14:31 crc kubenswrapper[4922]: I0929 10:14:31.466936 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9964cee5-67a1-4a42-84e3-3586ed6c3457" path="/var/lib/kubelet/pods/9964cee5-67a1-4a42-84e3-3586ed6c3457/volumes" Sep 29 10:14:37 crc kubenswrapper[4922]: I0929 10:14:37.452540 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:14:38 crc kubenswrapper[4922]: I0929 10:14:38.247095 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"87aa8b737efcb2fba85be231c4d660d1fe7fd4ea8854d1345f029fc3d1db3b1f"} Sep 29 
10:14:43 crc kubenswrapper[4922]: I0929 10:14:43.301709 4922 generic.go:334] "Generic (PLEG): container finished" podID="8653f711-4f91-4ce3-a900-95aa54ac26a1" containerID="0ad8a1e5bfde32713b7a3d30b2b32011c0f5463ece1aa1363a0e0d37e8c18d64" exitCode=0 Sep 29 10:14:43 crc kubenswrapper[4922]: I0929 10:14:43.301933 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" event={"ID":"8653f711-4f91-4ce3-a900-95aa54ac26a1","Type":"ContainerDied","Data":"0ad8a1e5bfde32713b7a3d30b2b32011c0f5463ece1aa1363a0e0d37e8c18d64"} Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.799738 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.876376 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fs7t\" (UniqueName: \"kubernetes.io/projected/8653f711-4f91-4ce3-a900-95aa54ac26a1-kube-api-access-9fs7t\") pod \"8653f711-4f91-4ce3-a900-95aa54ac26a1\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.876420 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-ssh-key\") pod \"8653f711-4f91-4ce3-a900-95aa54ac26a1\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.876509 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-inventory\") pod \"8653f711-4f91-4ce3-a900-95aa54ac26a1\" (UID: \"8653f711-4f91-4ce3-a900-95aa54ac26a1\") " Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.896301 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8653f711-4f91-4ce3-a900-95aa54ac26a1-kube-api-access-9fs7t" (OuterVolumeSpecName: "kube-api-access-9fs7t") pod "8653f711-4f91-4ce3-a900-95aa54ac26a1" (UID: "8653f711-4f91-4ce3-a900-95aa54ac26a1"). InnerVolumeSpecName "kube-api-access-9fs7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.911934 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-inventory" (OuterVolumeSpecName: "inventory") pod "8653f711-4f91-4ce3-a900-95aa54ac26a1" (UID: "8653f711-4f91-4ce3-a900-95aa54ac26a1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.914801 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8653f711-4f91-4ce3-a900-95aa54ac26a1" (UID: "8653f711-4f91-4ce3-a900-95aa54ac26a1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.977670 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fs7t\" (UniqueName: \"kubernetes.io/projected/8653f711-4f91-4ce3-a900-95aa54ac26a1-kube-api-access-9fs7t\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.977717 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:44 crc kubenswrapper[4922]: I0929 10:14:44.977726 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8653f711-4f91-4ce3-a900-95aa54ac26a1-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.323420 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" event={"ID":"8653f711-4f91-4ce3-a900-95aa54ac26a1","Type":"ContainerDied","Data":"6ba168076702e1180c7b05541c6650183a36272d0e5dc7cc697077d04612ccb1"} Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.323492 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba168076702e1180c7b05541c6650183a36272d0e5dc7cc697077d04612ccb1" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.323969 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.426447 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ntxlw"] Sep 29 10:14:45 crc kubenswrapper[4922]: E0929 10:14:45.426924 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8653f711-4f91-4ce3-a900-95aa54ac26a1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.426942 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8653f711-4f91-4ce3-a900-95aa54ac26a1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.427166 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8653f711-4f91-4ce3-a900-95aa54ac26a1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.428129 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.433505 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.434765 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.435061 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.441821 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ntxlw"] Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.443431 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.490958 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kwlm\" (UniqueName: \"kubernetes.io/projected/b579a838-93e1-47ac-8069-b49e76d8d630-kube-api-access-2kwlm\") pod \"ssh-known-hosts-edpm-deployment-ntxlw\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.491110 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ntxlw\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.491236 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ntxlw\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.593899 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kwlm\" (UniqueName: \"kubernetes.io/projected/b579a838-93e1-47ac-8069-b49e76d8d630-kube-api-access-2kwlm\") pod \"ssh-known-hosts-edpm-deployment-ntxlw\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.594408 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ntxlw\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.594569 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ntxlw\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.600489 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ntxlw\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: 
I0929 10:14:45.604455 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ntxlw\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.616254 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kwlm\" (UniqueName: \"kubernetes.io/projected/b579a838-93e1-47ac-8069-b49e76d8d630-kube-api-access-2kwlm\") pod \"ssh-known-hosts-edpm-deployment-ntxlw\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:45 crc kubenswrapper[4922]: I0929 10:14:45.760051 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:46 crc kubenswrapper[4922]: I0929 10:14:46.397533 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ntxlw"] Sep 29 10:14:47 crc kubenswrapper[4922]: I0929 10:14:47.351217 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" event={"ID":"b579a838-93e1-47ac-8069-b49e76d8d630","Type":"ContainerStarted","Data":"91c1ec271ac2498d2544a97b1f4fc6cab5d4c9ea80a5b1476331be6938c7db8c"} Sep 29 10:14:48 crc kubenswrapper[4922]: I0929 10:14:48.365017 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" event={"ID":"b579a838-93e1-47ac-8069-b49e76d8d630","Type":"ContainerStarted","Data":"c96311375a85a53d5dac829ac300b61fc96f0c9d9b16037fcb39ed5584c11711"} Sep 29 10:14:48 crc kubenswrapper[4922]: I0929 10:14:48.388740 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" podStartSLOduration=2.655481578 
podStartE2EDuration="3.388720939s" podCreationTimestamp="2025-09-29 10:14:45 +0000 UTC" firstStartedPulling="2025-09-29 10:14:46.408575096 +0000 UTC m=+1811.774805360" lastFinishedPulling="2025-09-29 10:14:47.141814457 +0000 UTC m=+1812.508044721" observedRunningTime="2025-09-29 10:14:48.384326917 +0000 UTC m=+1813.750557201" watchObservedRunningTime="2025-09-29 10:14:48.388720939 +0000 UTC m=+1813.754951203" Sep 29 10:14:51 crc kubenswrapper[4922]: I0929 10:14:51.199040 4922 scope.go:117] "RemoveContainer" containerID="bd77e9f0fab3b7e75cd86e2e597efc8afbfdf6fdf0dbda2d42afdf414dd6bb5d" Sep 29 10:14:54 crc kubenswrapper[4922]: I0929 10:14:54.434236 4922 generic.go:334] "Generic (PLEG): container finished" podID="b579a838-93e1-47ac-8069-b49e76d8d630" containerID="c96311375a85a53d5dac829ac300b61fc96f0c9d9b16037fcb39ed5584c11711" exitCode=0 Sep 29 10:14:54 crc kubenswrapper[4922]: I0929 10:14:54.434314 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" event={"ID":"b579a838-93e1-47ac-8069-b49e76d8d630","Type":"ContainerDied","Data":"c96311375a85a53d5dac829ac300b61fc96f0c9d9b16037fcb39ed5584c11711"} Sep 29 10:14:55 crc kubenswrapper[4922]: I0929 10:14:55.912456 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.053103 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-inventory-0\") pod \"b579a838-93e1-47ac-8069-b49e76d8d630\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.053200 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-ssh-key-openstack-edpm-ipam\") pod \"b579a838-93e1-47ac-8069-b49e76d8d630\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.053470 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kwlm\" (UniqueName: \"kubernetes.io/projected/b579a838-93e1-47ac-8069-b49e76d8d630-kube-api-access-2kwlm\") pod \"b579a838-93e1-47ac-8069-b49e76d8d630\" (UID: \"b579a838-93e1-47ac-8069-b49e76d8d630\") " Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.061240 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b579a838-93e1-47ac-8069-b49e76d8d630-kube-api-access-2kwlm" (OuterVolumeSpecName: "kube-api-access-2kwlm") pod "b579a838-93e1-47ac-8069-b49e76d8d630" (UID: "b579a838-93e1-47ac-8069-b49e76d8d630"). InnerVolumeSpecName "kube-api-access-2kwlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.086383 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b579a838-93e1-47ac-8069-b49e76d8d630" (UID: "b579a838-93e1-47ac-8069-b49e76d8d630"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.094611 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b579a838-93e1-47ac-8069-b49e76d8d630" (UID: "b579a838-93e1-47ac-8069-b49e76d8d630"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.156494 4922 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.156538 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b579a838-93e1-47ac-8069-b49e76d8d630-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.156553 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kwlm\" (UniqueName: \"kubernetes.io/projected/b579a838-93e1-47ac-8069-b49e76d8d630-kube-api-access-2kwlm\") on node \"crc\" DevicePath \"\"" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.463874 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" event={"ID":"b579a838-93e1-47ac-8069-b49e76d8d630","Type":"ContainerDied","Data":"91c1ec271ac2498d2544a97b1f4fc6cab5d4c9ea80a5b1476331be6938c7db8c"} Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.464434 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91c1ec271ac2498d2544a97b1f4fc6cab5d4c9ea80a5b1476331be6938c7db8c" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.463947 
4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ntxlw" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.627498 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn"] Sep 29 10:14:56 crc kubenswrapper[4922]: E0929 10:14:56.627982 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b579a838-93e1-47ac-8069-b49e76d8d630" containerName="ssh-known-hosts-edpm-deployment" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.628002 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b579a838-93e1-47ac-8069-b49e76d8d630" containerName="ssh-known-hosts-edpm-deployment" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.628222 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b579a838-93e1-47ac-8069-b49e76d8d630" containerName="ssh-known-hosts-edpm-deployment" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.628988 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.631617 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.632025 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.632027 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.632477 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.655309 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn"] Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.769095 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kfbbn\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.769559 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kfbbn\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.769795 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rswj\" (UniqueName: \"kubernetes.io/projected/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-kube-api-access-5rswj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kfbbn\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.871664 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kfbbn\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.872117 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kfbbn\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.872259 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rswj\" (UniqueName: \"kubernetes.io/projected/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-kube-api-access-5rswj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kfbbn\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.877954 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kfbbn\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.882024 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kfbbn\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.907219 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rswj\" (UniqueName: \"kubernetes.io/projected/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-kube-api-access-5rswj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kfbbn\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:56 crc kubenswrapper[4922]: I0929 10:14:56.950644 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:14:57 crc kubenswrapper[4922]: I0929 10:14:57.528253 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn"] Sep 29 10:14:58 crc kubenswrapper[4922]: I0929 10:14:58.488334 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" event={"ID":"3ae2127d-25fd-4296-9143-1f12b7ffd0c2","Type":"ContainerStarted","Data":"1eb7228f11b16a9f84b0025c00e5d2007a642be4489db69fbba59d9fb5e687e7"} Sep 29 10:14:58 crc kubenswrapper[4922]: I0929 10:14:58.489361 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" event={"ID":"3ae2127d-25fd-4296-9143-1f12b7ffd0c2","Type":"ContainerStarted","Data":"a96f74c55aa751c631ae8c2690352e44de5c23053a1033c708d0f5a226c65fb3"} Sep 29 10:14:58 crc kubenswrapper[4922]: I0929 10:14:58.513129 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" podStartSLOduration=2.103829589 podStartE2EDuration="2.513098761s" podCreationTimestamp="2025-09-29 10:14:56 +0000 UTC" firstStartedPulling="2025-09-29 10:14:57.527061932 +0000 UTC m=+1822.893292196" lastFinishedPulling="2025-09-29 10:14:57.936331094 +0000 UTC m=+1823.302561368" observedRunningTime="2025-09-29 10:14:58.509219232 +0000 UTC m=+1823.875449506" watchObservedRunningTime="2025-09-29 10:14:58.513098761 +0000 UTC m=+1823.879329025" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.143492 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284"] Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.145282 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.148512 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.148533 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.163198 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284"] Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.251841 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/115add18-d48d-4afa-9350-24a71beb29c2-config-volume\") pod \"collect-profiles-29319015-2q284\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.251936 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq5vd\" (UniqueName: \"kubernetes.io/projected/115add18-d48d-4afa-9350-24a71beb29c2-kube-api-access-pq5vd\") pod \"collect-profiles-29319015-2q284\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.252037 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/115add18-d48d-4afa-9350-24a71beb29c2-secret-volume\") pod \"collect-profiles-29319015-2q284\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.353559 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/115add18-d48d-4afa-9350-24a71beb29c2-secret-volume\") pod \"collect-profiles-29319015-2q284\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.353663 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/115add18-d48d-4afa-9350-24a71beb29c2-config-volume\") pod \"collect-profiles-29319015-2q284\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.353729 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq5vd\" (UniqueName: \"kubernetes.io/projected/115add18-d48d-4afa-9350-24a71beb29c2-kube-api-access-pq5vd\") pod \"collect-profiles-29319015-2q284\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.355007 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/115add18-d48d-4afa-9350-24a71beb29c2-config-volume\") pod \"collect-profiles-29319015-2q284\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.361318 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/115add18-d48d-4afa-9350-24a71beb29c2-secret-volume\") pod \"collect-profiles-29319015-2q284\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.381298 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq5vd\" (UniqueName: \"kubernetes.io/projected/115add18-d48d-4afa-9350-24a71beb29c2-kube-api-access-pq5vd\") pod \"collect-profiles-29319015-2q284\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:00 crc kubenswrapper[4922]: I0929 10:15:00.472223 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:01 crc kubenswrapper[4922]: I0929 10:15:01.002349 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284"] Sep 29 10:15:01 crc kubenswrapper[4922]: I0929 10:15:01.523175 4922 generic.go:334] "Generic (PLEG): container finished" podID="115add18-d48d-4afa-9350-24a71beb29c2" containerID="5676fdb58639b0e23d13d08c939d49f85407ab60e136237ba20963bc8757c248" exitCode=0 Sep 29 10:15:01 crc kubenswrapper[4922]: I0929 10:15:01.523329 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" event={"ID":"115add18-d48d-4afa-9350-24a71beb29c2","Type":"ContainerDied","Data":"5676fdb58639b0e23d13d08c939d49f85407ab60e136237ba20963bc8757c248"} Sep 29 10:15:01 crc kubenswrapper[4922]: I0929 10:15:01.523905 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" 
event={"ID":"115add18-d48d-4afa-9350-24a71beb29c2","Type":"ContainerStarted","Data":"c56a3422fab9f63d42eeabe09640e00677b2abe07ce6301685445f5948dda66b"} Sep 29 10:15:02 crc kubenswrapper[4922]: I0929 10:15:02.870142 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.011195 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/115add18-d48d-4afa-9350-24a71beb29c2-config-volume\") pod \"115add18-d48d-4afa-9350-24a71beb29c2\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.011462 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq5vd\" (UniqueName: \"kubernetes.io/projected/115add18-d48d-4afa-9350-24a71beb29c2-kube-api-access-pq5vd\") pod \"115add18-d48d-4afa-9350-24a71beb29c2\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.011573 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/115add18-d48d-4afa-9350-24a71beb29c2-secret-volume\") pod \"115add18-d48d-4afa-9350-24a71beb29c2\" (UID: \"115add18-d48d-4afa-9350-24a71beb29c2\") " Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.012411 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/115add18-d48d-4afa-9350-24a71beb29c2-config-volume" (OuterVolumeSpecName: "config-volume") pod "115add18-d48d-4afa-9350-24a71beb29c2" (UID: "115add18-d48d-4afa-9350-24a71beb29c2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.019914 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/115add18-d48d-4afa-9350-24a71beb29c2-kube-api-access-pq5vd" (OuterVolumeSpecName: "kube-api-access-pq5vd") pod "115add18-d48d-4afa-9350-24a71beb29c2" (UID: "115add18-d48d-4afa-9350-24a71beb29c2"). InnerVolumeSpecName "kube-api-access-pq5vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.020526 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/115add18-d48d-4afa-9350-24a71beb29c2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "115add18-d48d-4afa-9350-24a71beb29c2" (UID: "115add18-d48d-4afa-9350-24a71beb29c2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.114636 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/115add18-d48d-4afa-9350-24a71beb29c2-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.114687 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/115add18-d48d-4afa-9350-24a71beb29c2-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.114704 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq5vd\" (UniqueName: \"kubernetes.io/projected/115add18-d48d-4afa-9350-24a71beb29c2-kube-api-access-pq5vd\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.543247 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" 
event={"ID":"115add18-d48d-4afa-9350-24a71beb29c2","Type":"ContainerDied","Data":"c56a3422fab9f63d42eeabe09640e00677b2abe07ce6301685445f5948dda66b"} Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.543319 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56a3422fab9f63d42eeabe09640e00677b2abe07ce6301685445f5948dda66b" Sep 29 10:15:03 crc kubenswrapper[4922]: I0929 10:15:03.543280 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319015-2q284" Sep 29 10:15:06 crc kubenswrapper[4922]: I0929 10:15:06.577217 4922 generic.go:334] "Generic (PLEG): container finished" podID="3ae2127d-25fd-4296-9143-1f12b7ffd0c2" containerID="1eb7228f11b16a9f84b0025c00e5d2007a642be4489db69fbba59d9fb5e687e7" exitCode=0 Sep 29 10:15:06 crc kubenswrapper[4922]: I0929 10:15:06.577318 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" event={"ID":"3ae2127d-25fd-4296-9143-1f12b7ffd0c2","Type":"ContainerDied","Data":"1eb7228f11b16a9f84b0025c00e5d2007a642be4489db69fbba59d9fb5e687e7"} Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.041729 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.226061 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rswj\" (UniqueName: \"kubernetes.io/projected/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-kube-api-access-5rswj\") pod \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.226123 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-ssh-key\") pod \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.226290 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-inventory\") pod \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\" (UID: \"3ae2127d-25fd-4296-9143-1f12b7ffd0c2\") " Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.236301 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-kube-api-access-5rswj" (OuterVolumeSpecName: "kube-api-access-5rswj") pod "3ae2127d-25fd-4296-9143-1f12b7ffd0c2" (UID: "3ae2127d-25fd-4296-9143-1f12b7ffd0c2"). InnerVolumeSpecName "kube-api-access-5rswj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.258309 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3ae2127d-25fd-4296-9143-1f12b7ffd0c2" (UID: "3ae2127d-25fd-4296-9143-1f12b7ffd0c2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.263461 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-inventory" (OuterVolumeSpecName: "inventory") pod "3ae2127d-25fd-4296-9143-1f12b7ffd0c2" (UID: "3ae2127d-25fd-4296-9143-1f12b7ffd0c2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.329779 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rswj\" (UniqueName: \"kubernetes.io/projected/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-kube-api-access-5rswj\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.330252 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.330271 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae2127d-25fd-4296-9143-1f12b7ffd0c2-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.602243 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" event={"ID":"3ae2127d-25fd-4296-9143-1f12b7ffd0c2","Type":"ContainerDied","Data":"a96f74c55aa751c631ae8c2690352e44de5c23053a1033c708d0f5a226c65fb3"} Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.602297 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kfbbn" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.602300 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a96f74c55aa751c631ae8c2690352e44de5c23053a1033c708d0f5a226c65fb3" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.708075 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr"] Sep 29 10:15:08 crc kubenswrapper[4922]: E0929 10:15:08.709488 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="115add18-d48d-4afa-9350-24a71beb29c2" containerName="collect-profiles" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.709516 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="115add18-d48d-4afa-9350-24a71beb29c2" containerName="collect-profiles" Sep 29 10:15:08 crc kubenswrapper[4922]: E0929 10:15:08.709538 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae2127d-25fd-4296-9143-1f12b7ffd0c2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.709546 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae2127d-25fd-4296-9143-1f12b7ffd0c2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.709729 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae2127d-25fd-4296-9143-1f12b7ffd0c2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.709757 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="115add18-d48d-4afa-9350-24a71beb29c2" containerName="collect-profiles" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.710949 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.713023 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.713165 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.713500 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.723584 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.739948 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr"] Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.840259 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.840327 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpb96\" (UniqueName: \"kubernetes.io/projected/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-kube-api-access-hpb96\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.840811 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.943309 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.943404 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpb96\" (UniqueName: \"kubernetes.io/projected/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-kube-api-access-hpb96\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.943690 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.950391 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.951906 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:08 crc kubenswrapper[4922]: I0929 10:15:08.965365 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpb96\" (UniqueName: \"kubernetes.io/projected/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-kube-api-access-hpb96\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:09 crc kubenswrapper[4922]: I0929 10:15:09.035144 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:09 crc kubenswrapper[4922]: I0929 10:15:09.651908 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr"] Sep 29 10:15:10 crc kubenswrapper[4922]: I0929 10:15:10.631548 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" event={"ID":"2bb4b88d-fc96-488b-a144-7f524d2cd1e7","Type":"ContainerStarted","Data":"2f9485ba38efcc93180de93941e5ebfa6f11470d3287d6c907e94e76be742b46"} Sep 29 10:15:10 crc kubenswrapper[4922]: I0929 10:15:10.632120 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" event={"ID":"2bb4b88d-fc96-488b-a144-7f524d2cd1e7","Type":"ContainerStarted","Data":"a5e28729f3a6449dc8e4097d25ca8a21009132bf4ac17af62755559ffeb55312"} Sep 29 10:15:10 crc kubenswrapper[4922]: I0929 10:15:10.658551 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" podStartSLOduration=2.21905794 podStartE2EDuration="2.65850492s" podCreationTimestamp="2025-09-29 10:15:08 +0000 UTC" firstStartedPulling="2025-09-29 10:15:09.666256913 +0000 UTC m=+1835.032487177" lastFinishedPulling="2025-09-29 10:15:10.105703893 +0000 UTC m=+1835.471934157" observedRunningTime="2025-09-29 10:15:10.650461485 +0000 UTC m=+1836.016691749" watchObservedRunningTime="2025-09-29 10:15:10.65850492 +0000 UTC m=+1836.024735184" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.036021 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5wxjh"] Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.039709 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.054278 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wxjh"] Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.146176 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-catalog-content\") pod \"community-operators-5wxjh\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.146316 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-utilities\") pod \"community-operators-5wxjh\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.146369 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlklb\" (UniqueName: \"kubernetes.io/projected/911aaa77-9ebf-4869-9cc6-08db1685a9f1-kube-api-access-nlklb\") pod \"community-operators-5wxjh\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.249156 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlklb\" (UniqueName: \"kubernetes.io/projected/911aaa77-9ebf-4869-9cc6-08db1685a9f1-kube-api-access-nlklb\") pod \"community-operators-5wxjh\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.249288 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-catalog-content\") pod \"community-operators-5wxjh\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.249369 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-utilities\") pod \"community-operators-5wxjh\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.249958 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-catalog-content\") pod \"community-operators-5wxjh\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.249995 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-utilities\") pod \"community-operators-5wxjh\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.273307 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlklb\" (UniqueName: \"kubernetes.io/projected/911aaa77-9ebf-4869-9cc6-08db1685a9f1-kube-api-access-nlklb\") pod \"community-operators-5wxjh\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.362165 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:18 crc kubenswrapper[4922]: I0929 10:15:18.924966 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wxjh"] Sep 29 10:15:19 crc kubenswrapper[4922]: I0929 10:15:19.727031 4922 generic.go:334] "Generic (PLEG): container finished" podID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerID="596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790" exitCode=0 Sep 29 10:15:19 crc kubenswrapper[4922]: I0929 10:15:19.727134 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wxjh" event={"ID":"911aaa77-9ebf-4869-9cc6-08db1685a9f1","Type":"ContainerDied","Data":"596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790"} Sep 29 10:15:19 crc kubenswrapper[4922]: I0929 10:15:19.727624 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wxjh" event={"ID":"911aaa77-9ebf-4869-9cc6-08db1685a9f1","Type":"ContainerStarted","Data":"412e9468d35641905a2b3c8ea92d82c446cfb11f7db89384d7e16eefb567370e"} Sep 29 10:15:20 crc kubenswrapper[4922]: I0929 10:15:20.739912 4922 generic.go:334] "Generic (PLEG): container finished" podID="2bb4b88d-fc96-488b-a144-7f524d2cd1e7" containerID="2f9485ba38efcc93180de93941e5ebfa6f11470d3287d6c907e94e76be742b46" exitCode=0 Sep 29 10:15:20 crc kubenswrapper[4922]: I0929 10:15:20.740592 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" event={"ID":"2bb4b88d-fc96-488b-a144-7f524d2cd1e7","Type":"ContainerDied","Data":"2f9485ba38efcc93180de93941e5ebfa6f11470d3287d6c907e94e76be742b46"} Sep 29 10:15:20 crc kubenswrapper[4922]: I0929 10:15:20.744119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wxjh" 
event={"ID":"911aaa77-9ebf-4869-9cc6-08db1685a9f1","Type":"ContainerStarted","Data":"f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41"} Sep 29 10:15:21 crc kubenswrapper[4922]: E0929 10:15:21.031347 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911aaa77_9ebf_4869_9cc6_08db1685a9f1.slice/crio-conmon-f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911aaa77_9ebf_4869_9cc6_08db1685a9f1.slice/crio-f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41.scope\": RecentStats: unable to find data in memory cache]" Sep 29 10:15:21 crc kubenswrapper[4922]: I0929 10:15:21.764390 4922 generic.go:334] "Generic (PLEG): container finished" podID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerID="f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41" exitCode=0 Sep 29 10:15:21 crc kubenswrapper[4922]: I0929 10:15:21.764526 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wxjh" event={"ID":"911aaa77-9ebf-4869-9cc6-08db1685a9f1","Type":"ContainerDied","Data":"f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41"} Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.359876 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.464069 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-inventory\") pod \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.464218 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpb96\" (UniqueName: \"kubernetes.io/projected/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-kube-api-access-hpb96\") pod \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.464478 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-ssh-key\") pod \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\" (UID: \"2bb4b88d-fc96-488b-a144-7f524d2cd1e7\") " Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.484712 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-kube-api-access-hpb96" (OuterVolumeSpecName: "kube-api-access-hpb96") pod "2bb4b88d-fc96-488b-a144-7f524d2cd1e7" (UID: "2bb4b88d-fc96-488b-a144-7f524d2cd1e7"). InnerVolumeSpecName "kube-api-access-hpb96". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.497440 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2bb4b88d-fc96-488b-a144-7f524d2cd1e7" (UID: "2bb4b88d-fc96-488b-a144-7f524d2cd1e7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.499864 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-inventory" (OuterVolumeSpecName: "inventory") pod "2bb4b88d-fc96-488b-a144-7f524d2cd1e7" (UID: "2bb4b88d-fc96-488b-a144-7f524d2cd1e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.566802 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpb96\" (UniqueName: \"kubernetes.io/projected/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-kube-api-access-hpb96\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.567386 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.567402 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb4b88d-fc96-488b-a144-7f524d2cd1e7-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.780401 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wxjh" event={"ID":"911aaa77-9ebf-4869-9cc6-08db1685a9f1","Type":"ContainerStarted","Data":"f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f"} Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.783262 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" event={"ID":"2bb4b88d-fc96-488b-a144-7f524d2cd1e7","Type":"ContainerDied","Data":"a5e28729f3a6449dc8e4097d25ca8a21009132bf4ac17af62755559ffeb55312"} Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.783308 4922 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5e28729f3a6449dc8e4097d25ca8a21009132bf4ac17af62755559ffeb55312" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.783371 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.832959 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5wxjh" podStartSLOduration=2.363364508 podStartE2EDuration="4.832933834s" podCreationTimestamp="2025-09-29 10:15:18 +0000 UTC" firstStartedPulling="2025-09-29 10:15:19.729633171 +0000 UTC m=+1845.095863475" lastFinishedPulling="2025-09-29 10:15:22.199202537 +0000 UTC m=+1847.565432801" observedRunningTime="2025-09-29 10:15:22.807748103 +0000 UTC m=+1848.173978367" watchObservedRunningTime="2025-09-29 10:15:22.832933834 +0000 UTC m=+1848.199164098" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.894628 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24"] Sep 29 10:15:22 crc kubenswrapper[4922]: E0929 10:15:22.895444 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb4b88d-fc96-488b-a144-7f524d2cd1e7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.895565 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb4b88d-fc96-488b-a144-7f524d2cd1e7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.895873 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb4b88d-fc96-488b-a144-7f524d2cd1e7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.896764 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.906309 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24"] Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.910640 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.910861 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.910984 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.911095 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.911211 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.911337 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.911450 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.912090 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.980920 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.980993 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.981156 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.981290 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.981544 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.981708 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.981753 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92t96\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-kube-api-access-92t96\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.981805 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.981850 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.981911 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.981992 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.982088 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.982122 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:22 crc kubenswrapper[4922]: I0929 10:15:22.982365 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.085101 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.085623 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92t96\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-kube-api-access-92t96\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.085748 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.085874 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.085982 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.086120 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.086240 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.086359 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.086512 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.086630 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.086753 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.086957 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.087091 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.087242 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.093146 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.093698 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.093949 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.094462 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.095696 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.095893 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.096989 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.098280 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.098491 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.098816 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.098855 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.099783 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.100508 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.108756 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92t96\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-kube-api-access-92t96\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5sq24\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 
10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.261726 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:15:23 crc kubenswrapper[4922]: I0929 10:15:23.863116 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24"] Sep 29 10:15:23 crc kubenswrapper[4922]: W0929 10:15:23.871508 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd69265aa_752a_4d25_9af4_6dd389d13e8a.slice/crio-1d2acf1757b9a5bc4bc4efd3e2bb317b1b0bd5a357dcd4c17cc7750c4df7e085 WatchSource:0}: Error finding container 1d2acf1757b9a5bc4bc4efd3e2bb317b1b0bd5a357dcd4c17cc7750c4df7e085: Status 404 returned error can't find the container with id 1d2acf1757b9a5bc4bc4efd3e2bb317b1b0bd5a357dcd4c17cc7750c4df7e085 Sep 29 10:15:24 crc kubenswrapper[4922]: I0929 10:15:24.805703 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" event={"ID":"d69265aa-752a-4d25-9af4-6dd389d13e8a","Type":"ContainerStarted","Data":"0291c3ba3281c9e38e32fc492d5c085fc08bfeaec15dea961adacd8b11362478"} Sep 29 10:15:24 crc kubenswrapper[4922]: I0929 10:15:24.806217 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" event={"ID":"d69265aa-752a-4d25-9af4-6dd389d13e8a","Type":"ContainerStarted","Data":"1d2acf1757b9a5bc4bc4efd3e2bb317b1b0bd5a357dcd4c17cc7750c4df7e085"} Sep 29 10:15:24 crc kubenswrapper[4922]: I0929 10:15:24.854235 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" podStartSLOduration=2.409884681 podStartE2EDuration="2.854199805s" podCreationTimestamp="2025-09-29 10:15:22 +0000 UTC" firstStartedPulling="2025-09-29 10:15:23.874334463 
+0000 UTC m=+1849.240564727" lastFinishedPulling="2025-09-29 10:15:24.318649587 +0000 UTC m=+1849.684879851" observedRunningTime="2025-09-29 10:15:24.842012855 +0000 UTC m=+1850.208243139" watchObservedRunningTime="2025-09-29 10:15:24.854199805 +0000 UTC m=+1850.220430109" Sep 29 10:15:28 crc kubenswrapper[4922]: I0929 10:15:28.362537 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:28 crc kubenswrapper[4922]: I0929 10:15:28.363137 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:28 crc kubenswrapper[4922]: I0929 10:15:28.423700 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:28 crc kubenswrapper[4922]: I0929 10:15:28.911883 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:28 crc kubenswrapper[4922]: I0929 10:15:28.974820 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5wxjh"] Sep 29 10:15:30 crc kubenswrapper[4922]: I0929 10:15:30.888016 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5wxjh" podUID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerName="registry-server" containerID="cri-o://f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f" gracePeriod=2 Sep 29 10:15:31 crc kubenswrapper[4922]: E0929 10:15:31.381149 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911aaa77_9ebf_4869_9cc6_08db1685a9f1.slice/crio-conmon-f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f.scope\": RecentStats: unable to find data in memory cache]" 
Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.443356 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.510596 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-utilities\") pod \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.510978 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-catalog-content\") pod \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.511211 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlklb\" (UniqueName: \"kubernetes.io/projected/911aaa77-9ebf-4869-9cc6-08db1685a9f1-kube-api-access-nlklb\") pod \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\" (UID: \"911aaa77-9ebf-4869-9cc6-08db1685a9f1\") " Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.513634 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-utilities" (OuterVolumeSpecName: "utilities") pod "911aaa77-9ebf-4869-9cc6-08db1685a9f1" (UID: "911aaa77-9ebf-4869-9cc6-08db1685a9f1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.527461 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911aaa77-9ebf-4869-9cc6-08db1685a9f1-kube-api-access-nlklb" (OuterVolumeSpecName: "kube-api-access-nlklb") pod "911aaa77-9ebf-4869-9cc6-08db1685a9f1" (UID: "911aaa77-9ebf-4869-9cc6-08db1685a9f1"). InnerVolumeSpecName "kube-api-access-nlklb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.615100 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlklb\" (UniqueName: \"kubernetes.io/projected/911aaa77-9ebf-4869-9cc6-08db1685a9f1-kube-api-access-nlklb\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.615139 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.803467 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "911aaa77-9ebf-4869-9cc6-08db1685a9f1" (UID: "911aaa77-9ebf-4869-9cc6-08db1685a9f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.823545 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/911aaa77-9ebf-4869-9cc6-08db1685a9f1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.907610 4922 generic.go:334] "Generic (PLEG): container finished" podID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerID="f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f" exitCode=0 Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.907696 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wxjh" event={"ID":"911aaa77-9ebf-4869-9cc6-08db1685a9f1","Type":"ContainerDied","Data":"f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f"} Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.907755 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wxjh" event={"ID":"911aaa77-9ebf-4869-9cc6-08db1685a9f1","Type":"ContainerDied","Data":"412e9468d35641905a2b3c8ea92d82c446cfb11f7db89384d7e16eefb567370e"} Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.907798 4922 scope.go:117] "RemoveContainer" containerID="f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.908128 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wxjh" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.950400 4922 scope.go:117] "RemoveContainer" containerID="f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41" Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.957633 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5wxjh"] Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.971370 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5wxjh"] Sep 29 10:15:31 crc kubenswrapper[4922]: I0929 10:15:31.988631 4922 scope.go:117] "RemoveContainer" containerID="596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790" Sep 29 10:15:32 crc kubenswrapper[4922]: I0929 10:15:32.038307 4922 scope.go:117] "RemoveContainer" containerID="f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f" Sep 29 10:15:32 crc kubenswrapper[4922]: E0929 10:15:32.038889 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f\": container with ID starting with f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f not found: ID does not exist" containerID="f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f" Sep 29 10:15:32 crc kubenswrapper[4922]: I0929 10:15:32.038937 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f"} err="failed to get container status \"f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f\": rpc error: code = NotFound desc = could not find container \"f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f\": container with ID starting with f54907542b38c46043d4fa0c02ed514a9c63382ede93930b36936ba7c6095f3f not 
found: ID does not exist" Sep 29 10:15:32 crc kubenswrapper[4922]: I0929 10:15:32.038967 4922 scope.go:117] "RemoveContainer" containerID="f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41" Sep 29 10:15:32 crc kubenswrapper[4922]: E0929 10:15:32.039365 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41\": container with ID starting with f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41 not found: ID does not exist" containerID="f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41" Sep 29 10:15:32 crc kubenswrapper[4922]: I0929 10:15:32.039397 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41"} err="failed to get container status \"f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41\": rpc error: code = NotFound desc = could not find container \"f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41\": container with ID starting with f06f35dd5e5b1913ca1af3c1d0e3c4134a6b9aff3543d57e89f36847ed31ca41 not found: ID does not exist" Sep 29 10:15:32 crc kubenswrapper[4922]: I0929 10:15:32.039419 4922 scope.go:117] "RemoveContainer" containerID="596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790" Sep 29 10:15:32 crc kubenswrapper[4922]: E0929 10:15:32.039745 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790\": container with ID starting with 596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790 not found: ID does not exist" containerID="596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790" Sep 29 10:15:32 crc kubenswrapper[4922]: I0929 10:15:32.039814 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790"} err="failed to get container status \"596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790\": rpc error: code = NotFound desc = could not find container \"596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790\": container with ID starting with 596f75655e47700e4ddfe9381510ffa3bbfe045dcf7683d13c9e0b5bd395a790 not found: ID does not exist" Sep 29 10:15:33 crc kubenswrapper[4922]: I0929 10:15:33.466878 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" path="/var/lib/kubelet/pods/911aaa77-9ebf-4869-9cc6-08db1685a9f1/volumes" Sep 29 10:16:06 crc kubenswrapper[4922]: I0929 10:16:06.272283 4922 generic.go:334] "Generic (PLEG): container finished" podID="d69265aa-752a-4d25-9af4-6dd389d13e8a" containerID="0291c3ba3281c9e38e32fc492d5c085fc08bfeaec15dea961adacd8b11362478" exitCode=0 Sep 29 10:16:06 crc kubenswrapper[4922]: I0929 10:16:06.273099 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" event={"ID":"d69265aa-752a-4d25-9af4-6dd389d13e8a","Type":"ContainerDied","Data":"0291c3ba3281c9e38e32fc492d5c085fc08bfeaec15dea961adacd8b11362478"} Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.776462 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.939846 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940016 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-bootstrap-combined-ca-bundle\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940069 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940110 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940152 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-telemetry-combined-ca-bundle\") pod 
\"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940185 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92t96\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-kube-api-access-92t96\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940298 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-neutron-metadata-combined-ca-bundle\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940327 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-inventory\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940363 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-nova-combined-ca-bundle\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940407 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 
10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940520 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-repo-setup-combined-ca-bundle\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940577 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ssh-key\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940652 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-libvirt-combined-ca-bundle\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.940693 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ovn-combined-ca-bundle\") pod \"d69265aa-752a-4d25-9af4-6dd389d13e8a\" (UID: \"d69265aa-752a-4d25-9af4-6dd389d13e8a\") " Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.951177 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.951214 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.951267 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.951403 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-kube-api-access-92t96" (OuterVolumeSpecName: "kube-api-access-92t96") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "kube-api-access-92t96". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.951989 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.952097 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.954308 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.954331 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.955269 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.956048 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.957655 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.961230 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.984563 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-inventory" (OuterVolumeSpecName: "inventory") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:07 crc kubenswrapper[4922]: I0929 10:16:07.988000 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d69265aa-752a-4d25-9af4-6dd389d13e8a" (UID: "d69265aa-752a-4d25-9af4-6dd389d13e8a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043502 4922 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043550 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043563 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043576 4922 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043585 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92t96\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-kube-api-access-92t96\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc 
kubenswrapper[4922]: I0929 10:16:08.043595 4922 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043606 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043615 4922 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043624 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043634 4922 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043643 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043652 4922 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043661 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69265aa-752a-4d25-9af4-6dd389d13e8a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.043670 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d69265aa-752a-4d25-9af4-6dd389d13e8a-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.298489 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" event={"ID":"d69265aa-752a-4d25-9af4-6dd389d13e8a","Type":"ContainerDied","Data":"1d2acf1757b9a5bc4bc4efd3e2bb317b1b0bd5a357dcd4c17cc7750c4df7e085"} Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.298559 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d2acf1757b9a5bc4bc4efd3e2bb317b1b0bd5a357dcd4c17cc7750c4df7e085" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.298608 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5sq24" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.538027 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz"] Sep 29 10:16:08 crc kubenswrapper[4922]: E0929 10:16:08.538722 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerName="extract-utilities" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.538754 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerName="extract-utilities" Sep 29 10:16:08 crc kubenswrapper[4922]: E0929 10:16:08.538790 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerName="extract-content" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.538801 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerName="extract-content" Sep 29 10:16:08 crc kubenswrapper[4922]: E0929 10:16:08.541722 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69265aa-752a-4d25-9af4-6dd389d13e8a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.541740 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69265aa-752a-4d25-9af4-6dd389d13e8a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 29 10:16:08 crc kubenswrapper[4922]: E0929 10:16:08.541794 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerName="registry-server" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.541804 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerName="registry-server" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.542184 
4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69265aa-752a-4d25-9af4-6dd389d13e8a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.542230 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="911aaa77-9ebf-4869-9cc6-08db1685a9f1" containerName="registry-server" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.543163 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.546575 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.546968 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.547215 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.548102 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.548331 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.553153 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz"] Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.658329 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: 
\"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.658399 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.658884 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.659340 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.659501 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wvz6\" (UniqueName: \"kubernetes.io/projected/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-kube-api-access-7wvz6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.761279 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.761741 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.761893 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wvz6\" (UniqueName: \"kubernetes.io/projected/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-kube-api-access-7wvz6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.761992 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.762097 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.763060 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.767720 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.768056 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.772878 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.787727 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wvz6\" (UniqueName: \"kubernetes.io/projected/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-kube-api-access-7wvz6\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-fz6hz\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:08 crc kubenswrapper[4922]: I0929 10:16:08.866653 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:16:09 crc kubenswrapper[4922]: I0929 10:16:09.465593 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz"] Sep 29 10:16:10 crc kubenswrapper[4922]: I0929 10:16:10.320434 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" event={"ID":"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e","Type":"ContainerStarted","Data":"03033bd7db23f1542c47b70b8e16de6c863b243ed2b06272b21bb2fd1ca252c4"} Sep 29 10:16:11 crc kubenswrapper[4922]: I0929 10:16:11.331421 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" event={"ID":"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e","Type":"ContainerStarted","Data":"d597b60f2dfb01dff5b53115060a3c5c0a295e90c3221abd3146e14cb3063a10"} Sep 29 10:16:11 crc kubenswrapper[4922]: I0929 10:16:11.367219 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" podStartSLOduration=2.7196301 podStartE2EDuration="3.367184011s" podCreationTimestamp="2025-09-29 10:16:08 +0000 UTC" firstStartedPulling="2025-09-29 10:16:09.471553729 +0000 UTC m=+1894.837783993" lastFinishedPulling="2025-09-29 10:16:10.11910764 +0000 UTC m=+1895.485337904" observedRunningTime="2025-09-29 10:16:11.356331644 +0000 UTC m=+1896.722561928" watchObservedRunningTime="2025-09-29 10:16:11.367184011 +0000 UTC m=+1896.733414315" Sep 29 10:16:59 crc kubenswrapper[4922]: I0929 10:16:59.070976 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:16:59 crc kubenswrapper[4922]: I0929 10:16:59.071659 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:17:17 crc kubenswrapper[4922]: I0929 10:17:17.023998 4922 generic.go:334] "Generic (PLEG): container finished" podID="4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e" containerID="d597b60f2dfb01dff5b53115060a3c5c0a295e90c3221abd3146e14cb3063a10" exitCode=0 Sep 29 10:17:17 crc kubenswrapper[4922]: I0929 10:17:17.024138 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" event={"ID":"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e","Type":"ContainerDied","Data":"d597b60f2dfb01dff5b53115060a3c5c0a295e90c3221abd3146e14cb3063a10"} Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.509821 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.562361 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ssh-key\") pod \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.562534 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovn-combined-ca-bundle\") pod \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.562668 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovncontroller-config-0\") pod \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.562915 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-inventory\") pod \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.562989 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wvz6\" (UniqueName: \"kubernetes.io/projected/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-kube-api-access-7wvz6\") pod \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\" (UID: \"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e\") " Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.569999 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e" (UID: "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.570042 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-kube-api-access-7wvz6" (OuterVolumeSpecName: "kube-api-access-7wvz6") pod "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e" (UID: "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e"). InnerVolumeSpecName "kube-api-access-7wvz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.596181 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-inventory" (OuterVolumeSpecName: "inventory") pod "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e" (UID: "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.600148 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e" (UID: "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.601460 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e" (UID: "4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.666858 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.666906 4922 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.666921 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.666932 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wvz6\" (UniqueName: \"kubernetes.io/projected/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-kube-api-access-7wvz6\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:18 crc kubenswrapper[4922]: I0929 10:17:18.666941 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.055696 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" event={"ID":"4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e","Type":"ContainerDied","Data":"03033bd7db23f1542c47b70b8e16de6c863b243ed2b06272b21bb2fd1ca252c4"} Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.055758 4922 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="03033bd7db23f1542c47b70b8e16de6c863b243ed2b06272b21bb2fd1ca252c4" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.055803 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fz6hz" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.161523 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff"] Sep 29 10:17:19 crc kubenswrapper[4922]: E0929 10:17:19.162044 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.162065 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.162302 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.163093 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.166751 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.167107 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.167470 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.168010 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.168016 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.168538 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.196430 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff"] Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.278209 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.278290 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl425\" (UniqueName: \"kubernetes.io/projected/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-kube-api-access-fl425\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.278554 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.278889 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.279020 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.279099 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.381023 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.381118 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.381187 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.381240 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl425\" 
(UniqueName: \"kubernetes.io/projected/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-kube-api-access-fl425\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.381278 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.381342 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.388024 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.388100 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: 
\"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.390116 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.390935 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.391022 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 10:17:19.404235 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl425\" (UniqueName: \"kubernetes.io/projected/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-kube-api-access-fl425\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:19 crc kubenswrapper[4922]: I0929 
10:17:19.496556 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:17:20 crc kubenswrapper[4922]: I0929 10:17:20.056728 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff"] Sep 29 10:17:20 crc kubenswrapper[4922]: I0929 10:17:20.066961 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:17:21 crc kubenswrapper[4922]: I0929 10:17:21.087851 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" event={"ID":"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7","Type":"ContainerStarted","Data":"d1335ca49a6ff6c55419343665df15380a8c0c07044ed6cabb141308a8e472a5"} Sep 29 10:17:21 crc kubenswrapper[4922]: I0929 10:17:21.088358 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" event={"ID":"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7","Type":"ContainerStarted","Data":"9b901bf617448cc1a80c8399feff13eb87abba45af53660fd3978a015e331dfd"} Sep 29 10:17:21 crc kubenswrapper[4922]: I0929 10:17:21.110599 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" podStartSLOduration=1.474835474 podStartE2EDuration="2.110576873s" podCreationTimestamp="2025-09-29 10:17:19 +0000 UTC" firstStartedPulling="2025-09-29 10:17:20.06667796 +0000 UTC m=+1965.432908224" lastFinishedPulling="2025-09-29 10:17:20.702419359 +0000 UTC m=+1966.068649623" observedRunningTime="2025-09-29 10:17:21.109731581 +0000 UTC m=+1966.475961845" watchObservedRunningTime="2025-09-29 10:17:21.110576873 +0000 UTC m=+1966.476807137" Sep 29 10:17:29 crc kubenswrapper[4922]: I0929 10:17:29.070899 4922 patch_prober.go:28] interesting 
pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:17:29 crc kubenswrapper[4922]: I0929 10:17:29.071823 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:17:59 crc kubenswrapper[4922]: I0929 10:17:59.071229 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:17:59 crc kubenswrapper[4922]: I0929 10:17:59.072010 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:17:59 crc kubenswrapper[4922]: I0929 10:17:59.072090 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 10:17:59 crc kubenswrapper[4922]: I0929 10:17:59.072946 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87aa8b737efcb2fba85be231c4d660d1fe7fd4ea8854d1345f029fc3d1db3b1f"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Sep 29 10:17:59 crc kubenswrapper[4922]: I0929 10:17:59.073003 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://87aa8b737efcb2fba85be231c4d660d1fe7fd4ea8854d1345f029fc3d1db3b1f" gracePeriod=600 Sep 29 10:17:59 crc kubenswrapper[4922]: I0929 10:17:59.493395 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="87aa8b737efcb2fba85be231c4d660d1fe7fd4ea8854d1345f029fc3d1db3b1f" exitCode=0 Sep 29 10:17:59 crc kubenswrapper[4922]: I0929 10:17:59.493482 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"87aa8b737efcb2fba85be231c4d660d1fe7fd4ea8854d1345f029fc3d1db3b1f"} Sep 29 10:17:59 crc kubenswrapper[4922]: I0929 10:17:59.493878 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"} Sep 29 10:17:59 crc kubenswrapper[4922]: I0929 10:17:59.493906 4922 scope.go:117] "RemoveContainer" containerID="3638cb869411b2603cfb34a84808b8b3b0efe81654b61daa8d43b229ff1144fd" Sep 29 10:18:09 crc kubenswrapper[4922]: I0929 10:18:09.633951 4922 generic.go:334] "Generic (PLEG): container finished" podID="f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" containerID="d1335ca49a6ff6c55419343665df15380a8c0c07044ed6cabb141308a8e472a5" exitCode=0 Sep 29 10:18:09 crc kubenswrapper[4922]: I0929 10:18:09.634007 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" 
event={"ID":"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7","Type":"ContainerDied","Data":"d1335ca49a6ff6c55419343665df15380a8c0c07044ed6cabb141308a8e472a5"} Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.080959 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.205403 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl425\" (UniqueName: \"kubernetes.io/projected/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-kube-api-access-fl425\") pod \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.206156 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-metadata-combined-ca-bundle\") pod \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.206247 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.206357 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-ssh-key\") pod \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.206408 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-inventory\") pod \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.206489 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-nova-metadata-neutron-config-0\") pod \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\" (UID: \"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7\") " Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.213790 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-kube-api-access-fl425" (OuterVolumeSpecName: "kube-api-access-fl425") pod "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" (UID: "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7"). InnerVolumeSpecName "kube-api-access-fl425". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.214036 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" (UID: "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.234856 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" (UID: "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7"). 
InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.235181 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-inventory" (OuterVolumeSpecName: "inventory") pod "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" (UID: "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.265820 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" (UID: "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.266922 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" (UID: "f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.309116 4922 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.309169 4922 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.309188 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.309204 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.309218 4922 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.309232 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl425\" (UniqueName: \"kubernetes.io/projected/f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7-kube-api-access-fl425\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.653886 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" 
event={"ID":"f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7","Type":"ContainerDied","Data":"9b901bf617448cc1a80c8399feff13eb87abba45af53660fd3978a015e331dfd"} Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.654198 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b901bf617448cc1a80c8399feff13eb87abba45af53660fd3978a015e331dfd" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.654011 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.822146 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg"] Sep 29 10:18:11 crc kubenswrapper[4922]: E0929 10:18:11.822630 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.822684 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.822923 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.824949 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.827141 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.828181 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.828336 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.828627 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.832299 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Sep 29 10:18:11 crc kubenswrapper[4922]: I0929 10:18:11.835145 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg"] Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.022370 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.022781 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.022910 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.022944 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbfgv\" (UniqueName: \"kubernetes.io/projected/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-kube-api-access-vbfgv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.022991 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.124463 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.124613 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.124651 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbfgv\" (UniqueName: \"kubernetes.io/projected/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-kube-api-access-vbfgv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.124701 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.124783 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.129179 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.129423 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.130176 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.132736 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.147376 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbfgv\" (UniqueName: \"kubernetes.io/projected/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-kube-api-access-vbfgv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vvllg\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.190774 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" Sep 29 10:18:12 crc kubenswrapper[4922]: I0929 10:18:12.735364 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg"] Sep 29 10:18:12 crc kubenswrapper[4922]: W0929 10:18:12.736598 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcdc9bf2_2da5_4261_89b5_dd6111d25d3b.slice/crio-a1915e93194f7a3f2dd49fc346c5800439e160c31e9d376541ddfa96e2588aae WatchSource:0}: Error finding container a1915e93194f7a3f2dd49fc346c5800439e160c31e9d376541ddfa96e2588aae: Status 404 returned error can't find the container with id a1915e93194f7a3f2dd49fc346c5800439e160c31e9d376541ddfa96e2588aae Sep 29 10:18:13 crc kubenswrapper[4922]: I0929 10:18:13.674050 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" event={"ID":"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b","Type":"ContainerStarted","Data":"9aa0fcb85a52fa282bb779869f6d61c7714be5f21ee1d52f1cddacb5dba180dc"} Sep 29 10:18:13 crc kubenswrapper[4922]: I0929 10:18:13.674456 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" event={"ID":"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b","Type":"ContainerStarted","Data":"a1915e93194f7a3f2dd49fc346c5800439e160c31e9d376541ddfa96e2588aae"} Sep 29 10:18:13 crc kubenswrapper[4922]: I0929 10:18:13.697353 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" podStartSLOduration=2.214588076 podStartE2EDuration="2.697331832s" podCreationTimestamp="2025-09-29 10:18:11 +0000 UTC" firstStartedPulling="2025-09-29 10:18:12.740196201 +0000 UTC m=+2018.106426465" lastFinishedPulling="2025-09-29 10:18:13.222939957 +0000 UTC m=+2018.589170221" 
observedRunningTime="2025-09-29 10:18:13.69570135 +0000 UTC m=+2019.061931634" watchObservedRunningTime="2025-09-29 10:18:13.697331832 +0000 UTC m=+2019.063562096" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.067892 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rpcx5"] Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.071733 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.084824 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rpcx5"] Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.165731 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-catalog-content\") pod \"certified-operators-rpcx5\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.165849 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-utilities\") pod \"certified-operators-rpcx5\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.165884 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm28t\" (UniqueName: \"kubernetes.io/projected/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-kube-api-access-bm28t\") pod \"certified-operators-rpcx5\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 
10:18:43.268039 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-catalog-content\") pod \"certified-operators-rpcx5\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.268117 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-utilities\") pod \"certified-operators-rpcx5\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.268148 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm28t\" (UniqueName: \"kubernetes.io/projected/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-kube-api-access-bm28t\") pod \"certified-operators-rpcx5\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.268707 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-catalog-content\") pod \"certified-operators-rpcx5\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.268802 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-utilities\") pod \"certified-operators-rpcx5\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.291484 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm28t\" (UniqueName: \"kubernetes.io/projected/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-kube-api-access-bm28t\") pod \"certified-operators-rpcx5\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.413515 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.972866 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rpcx5"] Sep 29 10:18:43 crc kubenswrapper[4922]: I0929 10:18:43.990793 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpcx5" event={"ID":"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69","Type":"ContainerStarted","Data":"4f01391bb1910eb9aec0822cbd28177dbcafbb77088babb60102ed7442fa2083"} Sep 29 10:18:45 crc kubenswrapper[4922]: I0929 10:18:45.001807 4922 generic.go:334] "Generic (PLEG): container finished" podID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerID="eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c" exitCode=0 Sep 29 10:18:45 crc kubenswrapper[4922]: I0929 10:18:45.001873 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpcx5" event={"ID":"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69","Type":"ContainerDied","Data":"eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c"} Sep 29 10:18:46 crc kubenswrapper[4922]: I0929 10:18:46.014202 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpcx5" event={"ID":"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69","Type":"ContainerStarted","Data":"3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd"} Sep 29 10:18:47 crc kubenswrapper[4922]: I0929 10:18:47.028971 4922 
generic.go:334] "Generic (PLEG): container finished" podID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerID="3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd" exitCode=0 Sep 29 10:18:47 crc kubenswrapper[4922]: I0929 10:18:47.029159 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpcx5" event={"ID":"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69","Type":"ContainerDied","Data":"3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd"} Sep 29 10:18:48 crc kubenswrapper[4922]: I0929 10:18:48.045943 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpcx5" event={"ID":"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69","Type":"ContainerStarted","Data":"4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b"} Sep 29 10:18:48 crc kubenswrapper[4922]: I0929 10:18:48.082389 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rpcx5" podStartSLOduration=2.468146705 podStartE2EDuration="5.082355707s" podCreationTimestamp="2025-09-29 10:18:43 +0000 UTC" firstStartedPulling="2025-09-29 10:18:45.004149645 +0000 UTC m=+2050.370379909" lastFinishedPulling="2025-09-29 10:18:47.618358647 +0000 UTC m=+2052.984588911" observedRunningTime="2025-09-29 10:18:48.073742789 +0000 UTC m=+2053.439973073" watchObservedRunningTime="2025-09-29 10:18:48.082355707 +0000 UTC m=+2053.448585991" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.234389 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-79k55"] Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.237158 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.251796 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79k55"] Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.326076 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-utilities\") pod \"redhat-marketplace-79k55\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.326135 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-catalog-content\") pod \"redhat-marketplace-79k55\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.326476 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/9ffc9c2c-d805-43aa-85dc-e299de9618fa-kube-api-access-rgt6h\") pod \"redhat-marketplace-79k55\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.428868 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-utilities\") pod \"redhat-marketplace-79k55\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.428933 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-catalog-content\") pod \"redhat-marketplace-79k55\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.429030 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/9ffc9c2c-d805-43aa-85dc-e299de9618fa-kube-api-access-rgt6h\") pod \"redhat-marketplace-79k55\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.429685 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-utilities\") pod \"redhat-marketplace-79k55\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.429751 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-catalog-content\") pod \"redhat-marketplace-79k55\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.452923 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/9ffc9c2c-d805-43aa-85dc-e299de9618fa-kube-api-access-rgt6h\") pod \"redhat-marketplace-79k55\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:50 crc kubenswrapper[4922]: I0929 10:18:50.570197 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:18:51 crc kubenswrapper[4922]: I0929 10:18:51.064785 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79k55"] Sep 29 10:18:51 crc kubenswrapper[4922]: W0929 10:18:51.071023 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ffc9c2c_d805_43aa_85dc_e299de9618fa.slice/crio-0f5b975f60e53b47b8b3f0eaaa3b88bf4a0710103eb649b64442c0638d96d464 WatchSource:0}: Error finding container 0f5b975f60e53b47b8b3f0eaaa3b88bf4a0710103eb649b64442c0638d96d464: Status 404 returned error can't find the container with id 0f5b975f60e53b47b8b3f0eaaa3b88bf4a0710103eb649b64442c0638d96d464 Sep 29 10:18:52 crc kubenswrapper[4922]: I0929 10:18:52.092547 4922 generic.go:334] "Generic (PLEG): container finished" podID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" containerID="a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294" exitCode=0 Sep 29 10:18:52 crc kubenswrapper[4922]: I0929 10:18:52.092602 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79k55" event={"ID":"9ffc9c2c-d805-43aa-85dc-e299de9618fa","Type":"ContainerDied","Data":"a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294"} Sep 29 10:18:52 crc kubenswrapper[4922]: I0929 10:18:52.092973 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79k55" event={"ID":"9ffc9c2c-d805-43aa-85dc-e299de9618fa","Type":"ContainerStarted","Data":"0f5b975f60e53b47b8b3f0eaaa3b88bf4a0710103eb649b64442c0638d96d464"} Sep 29 10:18:53 crc kubenswrapper[4922]: I0929 10:18:53.414363 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:53 crc kubenswrapper[4922]: I0929 10:18:53.414795 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:53 crc kubenswrapper[4922]: I0929 10:18:53.479193 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:54 crc kubenswrapper[4922]: I0929 10:18:54.115317 4922 generic.go:334] "Generic (PLEG): container finished" podID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" containerID="c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f" exitCode=0 Sep 29 10:18:54 crc kubenswrapper[4922]: I0929 10:18:54.115665 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79k55" event={"ID":"9ffc9c2c-d805-43aa-85dc-e299de9618fa","Type":"ContainerDied","Data":"c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f"} Sep 29 10:18:54 crc kubenswrapper[4922]: I0929 10:18:54.180627 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:55 crc kubenswrapper[4922]: I0929 10:18:55.019139 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rpcx5"] Sep 29 10:18:55 crc kubenswrapper[4922]: I0929 10:18:55.130272 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79k55" event={"ID":"9ffc9c2c-d805-43aa-85dc-e299de9618fa","Type":"ContainerStarted","Data":"71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db"} Sep 29 10:18:55 crc kubenswrapper[4922]: I0929 10:18:55.160866 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-79k55" podStartSLOduration=2.583723981 podStartE2EDuration="5.160816251s" podCreationTimestamp="2025-09-29 10:18:50 +0000 UTC" firstStartedPulling="2025-09-29 10:18:52.096417459 +0000 UTC m=+2057.462647723" lastFinishedPulling="2025-09-29 10:18:54.673509729 +0000 UTC m=+2060.039739993" 
observedRunningTime="2025-09-29 10:18:55.151130055 +0000 UTC m=+2060.517360329" watchObservedRunningTime="2025-09-29 10:18:55.160816251 +0000 UTC m=+2060.527046515" Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.143097 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rpcx5" podUID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerName="registry-server" containerID="cri-o://4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b" gracePeriod=2 Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.672058 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.780198 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm28t\" (UniqueName: \"kubernetes.io/projected/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-kube-api-access-bm28t\") pod \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.780423 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-utilities\") pod \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.780653 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-catalog-content\") pod \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\" (UID: \"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69\") " Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.783167 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-utilities" (OuterVolumeSpecName: "utilities") pod "2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" (UID: "2bbe74a9-d743-4b14-81d5-a6cd0b28aa69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.790604 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-kube-api-access-bm28t" (OuterVolumeSpecName: "kube-api-access-bm28t") pod "2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" (UID: "2bbe74a9-d743-4b14-81d5-a6cd0b28aa69"). InnerVolumeSpecName "kube-api-access-bm28t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.848427 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" (UID: "2bbe74a9-d743-4b14-81d5-a6cd0b28aa69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.886591 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.886655 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm28t\" (UniqueName: \"kubernetes.io/projected/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-kube-api-access-bm28t\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:56 crc kubenswrapper[4922]: I0929 10:18:56.886675 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.156537 4922 generic.go:334] "Generic (PLEG): container finished" podID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerID="4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b" exitCode=0 Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.156603 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpcx5" event={"ID":"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69","Type":"ContainerDied","Data":"4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b"} Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.156654 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpcx5" event={"ID":"2bbe74a9-d743-4b14-81d5-a6cd0b28aa69","Type":"ContainerDied","Data":"4f01391bb1910eb9aec0822cbd28177dbcafbb77088babb60102ed7442fa2083"} Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.156679 4922 scope.go:117] "RemoveContainer" containerID="4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 
10:18:57.156732 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpcx5" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.181060 4922 scope.go:117] "RemoveContainer" containerID="3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.195276 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rpcx5"] Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.205991 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rpcx5"] Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.220284 4922 scope.go:117] "RemoveContainer" containerID="eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.249814 4922 scope.go:117] "RemoveContainer" containerID="4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b" Sep 29 10:18:57 crc kubenswrapper[4922]: E0929 10:18:57.250632 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b\": container with ID starting with 4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b not found: ID does not exist" containerID="4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.250710 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b"} err="failed to get container status \"4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b\": rpc error: code = NotFound desc = could not find container \"4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b\": container with ID starting with 
4298bd47fa7ef7495e8d0c6871aff34c88c7135e217689472bb4f0ae037a728b not found: ID does not exist" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.250770 4922 scope.go:117] "RemoveContainer" containerID="3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd" Sep 29 10:18:57 crc kubenswrapper[4922]: E0929 10:18:57.251257 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd\": container with ID starting with 3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd not found: ID does not exist" containerID="3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.251309 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd"} err="failed to get container status \"3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd\": rpc error: code = NotFound desc = could not find container \"3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd\": container with ID starting with 3b8cf3ba730b3fdd8fa56ff99e18da3997aaf5598907debf58d8214899fef4fd not found: ID does not exist" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.251327 4922 scope.go:117] "RemoveContainer" containerID="eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c" Sep 29 10:18:57 crc kubenswrapper[4922]: E0929 10:18:57.251573 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c\": container with ID starting with eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c not found: ID does not exist" containerID="eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c" Sep 29 10:18:57 crc 
kubenswrapper[4922]: I0929 10:18:57.251627 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c"} err="failed to get container status \"eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c\": rpc error: code = NotFound desc = could not find container \"eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c\": container with ID starting with eb995335ef4c3e9071bc5f0e537e3d8645554aaa826e04da61ddda469a33f47c not found: ID does not exist" Sep 29 10:18:57 crc kubenswrapper[4922]: I0929 10:18:57.467716 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" path="/var/lib/kubelet/pods/2bbe74a9-d743-4b14-81d5-a6cd0b28aa69/volumes" Sep 29 10:19:00 crc kubenswrapper[4922]: I0929 10:19:00.571179 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:19:00 crc kubenswrapper[4922]: I0929 10:19:00.572147 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:19:00 crc kubenswrapper[4922]: I0929 10:19:00.626498 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:19:01 crc kubenswrapper[4922]: I0929 10:19:01.250012 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:19:01 crc kubenswrapper[4922]: I0929 10:19:01.821221 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79k55"] Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.220270 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-79k55" podUID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" 
containerName="registry-server" containerID="cri-o://71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db" gracePeriod=2 Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.699250 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.752917 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/9ffc9c2c-d805-43aa-85dc-e299de9618fa-kube-api-access-rgt6h\") pod \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.752987 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-catalog-content\") pod \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.753057 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-utilities\") pod \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\" (UID: \"9ffc9c2c-d805-43aa-85dc-e299de9618fa\") " Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.754252 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-utilities" (OuterVolumeSpecName: "utilities") pod "9ffc9c2c-d805-43aa-85dc-e299de9618fa" (UID: "9ffc9c2c-d805-43aa-85dc-e299de9618fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.763531 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffc9c2c-d805-43aa-85dc-e299de9618fa-kube-api-access-rgt6h" (OuterVolumeSpecName: "kube-api-access-rgt6h") pod "9ffc9c2c-d805-43aa-85dc-e299de9618fa" (UID: "9ffc9c2c-d805-43aa-85dc-e299de9618fa"). InnerVolumeSpecName "kube-api-access-rgt6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.773188 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ffc9c2c-d805-43aa-85dc-e299de9618fa" (UID: "9ffc9c2c-d805-43aa-85dc-e299de9618fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.856084 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.856358 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/9ffc9c2c-d805-43aa-85dc-e299de9618fa-kube-api-access-rgt6h\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:03 crc kubenswrapper[4922]: I0929 10:19:03.856434 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffc9c2c-d805-43aa-85dc-e299de9618fa-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.232554 4922 generic.go:334] "Generic (PLEG): container finished" podID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" 
containerID="71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db" exitCode=0 Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.232617 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79k55" event={"ID":"9ffc9c2c-d805-43aa-85dc-e299de9618fa","Type":"ContainerDied","Data":"71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db"} Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.232655 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79k55" event={"ID":"9ffc9c2c-d805-43aa-85dc-e299de9618fa","Type":"ContainerDied","Data":"0f5b975f60e53b47b8b3f0eaaa3b88bf4a0710103eb649b64442c0638d96d464"} Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.232659 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79k55" Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.232680 4922 scope.go:117] "RemoveContainer" containerID="71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db" Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.261251 4922 scope.go:117] "RemoveContainer" containerID="c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f" Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.282737 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79k55"] Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.292752 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-79k55"] Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.327147 4922 scope.go:117] "RemoveContainer" containerID="a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294" Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.424404 4922 scope.go:117] "RemoveContainer" containerID="71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db" Sep 29 
10:19:04 crc kubenswrapper[4922]: E0929 10:19:04.433105 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db\": container with ID starting with 71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db not found: ID does not exist" containerID="71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db" Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.433171 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db"} err="failed to get container status \"71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db\": rpc error: code = NotFound desc = could not find container \"71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db\": container with ID starting with 71c23ec367c1f3086c5845af78ddc705adb9b048a459739183031ac517b4c7db not found: ID does not exist" Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.433209 4922 scope.go:117] "RemoveContainer" containerID="c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f" Sep 29 10:19:04 crc kubenswrapper[4922]: E0929 10:19:04.436230 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f\": container with ID starting with c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f not found: ID does not exist" containerID="c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f" Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.436279 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f"} err="failed to get container status 
\"c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f\": rpc error: code = NotFound desc = could not find container \"c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f\": container with ID starting with c2d3b2d6ab8ef191362e553602349f1c59bfe61f827588e19ea3bf7a9cd7681f not found: ID does not exist"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.436303 4922 scope.go:117] "RemoveContainer" containerID="a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294"
Sep 29 10:19:04 crc kubenswrapper[4922]: E0929 10:19:04.440515 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294\": container with ID starting with a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294 not found: ID does not exist" containerID="a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.440571 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294"} err="failed to get container status \"a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294\": rpc error: code = NotFound desc = could not find container \"a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294\": container with ID starting with a86bb7bac5d14d99c922ecfc6ba293f794374dbb5ba8ce5857d1a699cd880294 not found: ID does not exist"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.627333 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qhntm"]
Sep 29 10:19:04 crc kubenswrapper[4922]: E0929 10:19:04.628154 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" containerName="extract-content"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.628234 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" containerName="extract-content"
Sep 29 10:19:04 crc kubenswrapper[4922]: E0929 10:19:04.628313 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerName="registry-server"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.628365 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerName="registry-server"
Sep 29 10:19:04 crc kubenswrapper[4922]: E0929 10:19:04.628445 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" containerName="extract-utilities"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.628499 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" containerName="extract-utilities"
Sep 29 10:19:04 crc kubenswrapper[4922]: E0929 10:19:04.628588 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerName="extract-content"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.628646 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerName="extract-content"
Sep 29 10:19:04 crc kubenswrapper[4922]: E0929 10:19:04.628725 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerName="extract-utilities"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.628780 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerName="extract-utilities"
Sep 29 10:19:04 crc kubenswrapper[4922]: E0929 10:19:04.628867 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" containerName="registry-server"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.628936 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" containerName="registry-server"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.629186 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" containerName="registry-server"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.629278 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bbe74a9-d743-4b14-81d5-a6cd0b28aa69" containerName="registry-server"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.634515 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.644028 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhntm"]
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.685463 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-catalog-content\") pod \"redhat-operators-qhntm\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") " pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.685819 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvldg\" (UniqueName: \"kubernetes.io/projected/c39be65c-4dcb-4ac3-aee7-f0237de151cb-kube-api-access-gvldg\") pod \"redhat-operators-qhntm\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") " pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.686748 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-utilities\") pod \"redhat-operators-qhntm\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") " pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.789203 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-utilities\") pod \"redhat-operators-qhntm\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") " pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.789288 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-catalog-content\") pod \"redhat-operators-qhntm\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") " pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.789319 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvldg\" (UniqueName: \"kubernetes.io/projected/c39be65c-4dcb-4ac3-aee7-f0237de151cb-kube-api-access-gvldg\") pod \"redhat-operators-qhntm\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") " pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.790140 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-utilities\") pod \"redhat-operators-qhntm\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") " pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.790468 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-catalog-content\") pod \"redhat-operators-qhntm\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") " pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.812415 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvldg\" (UniqueName: \"kubernetes.io/projected/c39be65c-4dcb-4ac3-aee7-f0237de151cb-kube-api-access-gvldg\") pod \"redhat-operators-qhntm\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") " pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:04 crc kubenswrapper[4922]: I0929 10:19:04.958055 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:05 crc kubenswrapper[4922]: I0929 10:19:05.449265 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhntm"]
Sep 29 10:19:05 crc kubenswrapper[4922]: I0929 10:19:05.467943 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffc9c2c-d805-43aa-85dc-e299de9618fa" path="/var/lib/kubelet/pods/9ffc9c2c-d805-43aa-85dc-e299de9618fa/volumes"
Sep 29 10:19:06 crc kubenswrapper[4922]: I0929 10:19:06.258748 4922 generic.go:334] "Generic (PLEG): container finished" podID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerID="bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372" exitCode=0
Sep 29 10:19:06 crc kubenswrapper[4922]: I0929 10:19:06.258820 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhntm" event={"ID":"c39be65c-4dcb-4ac3-aee7-f0237de151cb","Type":"ContainerDied","Data":"bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372"}
Sep 29 10:19:06 crc kubenswrapper[4922]: I0929 10:19:06.259140 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhntm" event={"ID":"c39be65c-4dcb-4ac3-aee7-f0237de151cb","Type":"ContainerStarted","Data":"b721b0da8b38b94ee1d2abc2cbbaed1d602b9673680ee401c0deef521ffca49f"}
Sep 29 10:19:08 crc kubenswrapper[4922]: I0929 10:19:08.293661 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhntm" event={"ID":"c39be65c-4dcb-4ac3-aee7-f0237de151cb","Type":"ContainerStarted","Data":"fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e"}
Sep 29 10:19:09 crc kubenswrapper[4922]: I0929 10:19:09.307506 4922 generic.go:334] "Generic (PLEG): container finished" podID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerID="fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e" exitCode=0
Sep 29 10:19:09 crc kubenswrapper[4922]: I0929 10:19:09.307562 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhntm" event={"ID":"c39be65c-4dcb-4ac3-aee7-f0237de151cb","Type":"ContainerDied","Data":"fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e"}
Sep 29 10:19:11 crc kubenswrapper[4922]: I0929 10:19:11.332466 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhntm" event={"ID":"c39be65c-4dcb-4ac3-aee7-f0237de151cb","Type":"ContainerStarted","Data":"2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a"}
Sep 29 10:19:11 crc kubenswrapper[4922]: I0929 10:19:11.361014 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qhntm" podStartSLOduration=3.389526143 podStartE2EDuration="7.360991664s" podCreationTimestamp="2025-09-29 10:19:04 +0000 UTC" firstStartedPulling="2025-09-29 10:19:06.262077329 +0000 UTC m=+2071.628307593" lastFinishedPulling="2025-09-29 10:19:10.23354285 +0000 UTC m=+2075.599773114" observedRunningTime="2025-09-29 10:19:11.357677281 +0000 UTC m=+2076.723907555" watchObservedRunningTime="2025-09-29 10:19:11.360991664 +0000 UTC m=+2076.727221928"
Sep 29 10:19:14 crc kubenswrapper[4922]: I0929 10:19:14.958446 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:14 crc kubenswrapper[4922]: I0929 10:19:14.959314 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:16 crc kubenswrapper[4922]: I0929 10:19:16.020171 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhntm" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerName="registry-server" probeResult="failure" output=<
Sep 29 10:19:16 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Sep 29 10:19:16 crc kubenswrapper[4922]: >
Sep 29 10:19:25 crc kubenswrapper[4922]: I0929 10:19:25.016633 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:25 crc kubenswrapper[4922]: I0929 10:19:25.067757 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:25 crc kubenswrapper[4922]: I0929 10:19:25.275516 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qhntm"]
Sep 29 10:19:26 crc kubenswrapper[4922]: I0929 10:19:26.488396 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qhntm" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerName="registry-server" containerID="cri-o://2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a" gracePeriod=2
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.012083 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.036777 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-catalog-content\") pod \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") "
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.037068 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-utilities\") pod \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") "
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.037172 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvldg\" (UniqueName: \"kubernetes.io/projected/c39be65c-4dcb-4ac3-aee7-f0237de151cb-kube-api-access-gvldg\") pod \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\" (UID: \"c39be65c-4dcb-4ac3-aee7-f0237de151cb\") "
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.040171 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-utilities" (OuterVolumeSpecName: "utilities") pod "c39be65c-4dcb-4ac3-aee7-f0237de151cb" (UID: "c39be65c-4dcb-4ac3-aee7-f0237de151cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.045278 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39be65c-4dcb-4ac3-aee7-f0237de151cb-kube-api-access-gvldg" (OuterVolumeSpecName: "kube-api-access-gvldg") pod "c39be65c-4dcb-4ac3-aee7-f0237de151cb" (UID: "c39be65c-4dcb-4ac3-aee7-f0237de151cb"). InnerVolumeSpecName "kube-api-access-gvldg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.135981 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c39be65c-4dcb-4ac3-aee7-f0237de151cb" (UID: "c39be65c-4dcb-4ac3-aee7-f0237de151cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.140445 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.140482 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvldg\" (UniqueName: \"kubernetes.io/projected/c39be65c-4dcb-4ac3-aee7-f0237de151cb-kube-api-access-gvldg\") on node \"crc\" DevicePath \"\""
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.140498 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39be65c-4dcb-4ac3-aee7-f0237de151cb-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.506757 4922 generic.go:334] "Generic (PLEG): container finished" podID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerID="2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a" exitCode=0
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.506885 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhntm" event={"ID":"c39be65c-4dcb-4ac3-aee7-f0237de151cb","Type":"ContainerDied","Data":"2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a"}
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.506921 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhntm" event={"ID":"c39be65c-4dcb-4ac3-aee7-f0237de151cb","Type":"ContainerDied","Data":"b721b0da8b38b94ee1d2abc2cbbaed1d602b9673680ee401c0deef521ffca49f"}
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.506967 4922 scope.go:117] "RemoveContainer" containerID="2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.507036 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhntm"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.541786 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qhntm"]
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.541976 4922 scope.go:117] "RemoveContainer" containerID="fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.549324 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qhntm"]
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.572317 4922 scope.go:117] "RemoveContainer" containerID="bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.616393 4922 scope.go:117] "RemoveContainer" containerID="2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a"
Sep 29 10:19:27 crc kubenswrapper[4922]: E0929 10:19:27.617041 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a\": container with ID starting with 2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a not found: ID does not exist" containerID="2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.617120 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a"} err="failed to get container status \"2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a\": rpc error: code = NotFound desc = could not find container \"2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a\": container with ID starting with 2447c6b8b6b140d3ca14a6584f76ee6f5774a06afe0e26b415f4b631e9dac47a not found: ID does not exist"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.617158 4922 scope.go:117] "RemoveContainer" containerID="fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e"
Sep 29 10:19:27 crc kubenswrapper[4922]: E0929 10:19:27.617738 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e\": container with ID starting with fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e not found: ID does not exist" containerID="fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.617796 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e"} err="failed to get container status \"fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e\": rpc error: code = NotFound desc = could not find container \"fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e\": container with ID starting with fefec35e76dcf7e368f3a54ff1dc095b936e2a0b5509368b11d58ba05a7f5f5e not found: ID does not exist"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.617847 4922 scope.go:117] "RemoveContainer" containerID="bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372"
Sep 29 10:19:27 crc kubenswrapper[4922]: E0929 10:19:27.618178 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372\": container with ID starting with bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372 not found: ID does not exist" containerID="bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372"
Sep 29 10:19:27 crc kubenswrapper[4922]: I0929 10:19:27.618255 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372"} err="failed to get container status \"bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372\": rpc error: code = NotFound desc = could not find container \"bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372\": container with ID starting with bcd3f7cc77cd3617cf338c6e6106ccaf4144f40618c182b629313133bebbd372 not found: ID does not exist"
Sep 29 10:19:29 crc kubenswrapper[4922]: I0929 10:19:29.464385 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" path="/var/lib/kubelet/pods/c39be65c-4dcb-4ac3-aee7-f0237de151cb/volumes"
Sep 29 10:19:59 crc kubenswrapper[4922]: I0929 10:19:59.070595 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:19:59 crc kubenswrapper[4922]: I0929 10:19:59.071340 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:20:29 crc kubenswrapper[4922]: I0929 10:20:29.070968 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:20:29 crc kubenswrapper[4922]: I0929 10:20:29.071499 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:20:59 crc kubenswrapper[4922]: I0929 10:20:59.070492 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:20:59 crc kubenswrapper[4922]: I0929 10:20:59.071037 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:20:59 crc kubenswrapper[4922]: I0929 10:20:59.071095 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq"
Sep 29 10:20:59 crc kubenswrapper[4922]: I0929 10:20:59.072032 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 10:20:59 crc kubenswrapper[4922]: I0929 10:20:59.072089 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" gracePeriod=600
Sep 29 10:20:59 crc kubenswrapper[4922]: E0929 10:20:59.202090 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:20:59 crc kubenswrapper[4922]: I0929 10:20:59.495461 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" exitCode=0
Sep 29 10:20:59 crc kubenswrapper[4922]: I0929 10:20:59.495537 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"}
Sep 29 10:20:59 crc kubenswrapper[4922]: I0929 10:20:59.495813 4922 scope.go:117] "RemoveContainer" containerID="87aa8b737efcb2fba85be231c4d660d1fe7fd4ea8854d1345f029fc3d1db3b1f"
Sep 29 10:20:59 crc kubenswrapper[4922]: I0929 10:20:59.496451 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"
Sep 29 10:20:59 crc kubenswrapper[4922]: E0929 10:20:59.496902 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:21:13 crc kubenswrapper[4922]: I0929 10:21:13.452393 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"
Sep 29 10:21:13 crc kubenswrapper[4922]: E0929 10:21:13.453485 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:21:24 crc kubenswrapper[4922]: I0929 10:21:24.452040 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"
Sep 29 10:21:24 crc kubenswrapper[4922]: E0929 10:21:24.453689 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:21:39 crc kubenswrapper[4922]: I0929 10:21:39.452369 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"
Sep 29 10:21:39 crc kubenswrapper[4922]: E0929 10:21:39.454953 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:21:52 crc kubenswrapper[4922]: I0929 10:21:52.452739 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"
Sep 29 10:21:52 crc kubenswrapper[4922]: E0929 10:21:52.453939 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:22:04 crc kubenswrapper[4922]: I0929 10:22:04.452393 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"
Sep 29 10:22:04 crc kubenswrapper[4922]: E0929 10:22:04.454177 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:22:09 crc kubenswrapper[4922]: I0929 10:22:09.779516 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-5f74f76895-9f28s" podUID="1b044ac1-a144-454a-a2f7-bf438ba13cc0" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Sep 29 10:22:19 crc kubenswrapper[4922]: I0929 10:22:19.452699 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5"
Sep 29 10:22:19 crc kubenswrapper[4922]: E0929 10:22:19.454020 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:22:22 crc kubenswrapper[4922]: I0929 10:22:22.388029 4922 generic.go:334] "Generic (PLEG): container finished" podID="bcdc9bf2-2da5-4261-89b5-dd6111d25d3b" containerID="9aa0fcb85a52fa282bb779869f6d61c7714be5f21ee1d52f1cddacb5dba180dc" exitCode=0
Sep 29 10:22:22 crc kubenswrapper[4922]: I0929 10:22:22.388141 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" event={"ID":"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b","Type":"ContainerDied","Data":"9aa0fcb85a52fa282bb779869f6d61c7714be5f21ee1d52f1cddacb5dba180dc"}
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.877753 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg"
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.935726 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-combined-ca-bundle\") pod \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") "
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.935876 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-inventory\") pod \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") "
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.935937 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-ssh-key\") pod \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") "
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.936078 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-secret-0\") pod \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") "
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.936219 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbfgv\" (UniqueName: \"kubernetes.io/projected/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-kube-api-access-vbfgv\") pod \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\" (UID: \"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b\") "
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.942985 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b" (UID: "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.945106 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-kube-api-access-vbfgv" (OuterVolumeSpecName: "kube-api-access-vbfgv") pod "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b" (UID: "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b"). InnerVolumeSpecName "kube-api-access-vbfgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.970488 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-inventory" (OuterVolumeSpecName: "inventory") pod "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b" (UID: "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.970975 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b" (UID: "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:22:23 crc kubenswrapper[4922]: I0929 10:22:23.975105 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b" (UID: "bcdc9bf2-2da5-4261-89b5-dd6111d25d3b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.039192 4922 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.039237 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbfgv\" (UniqueName: \"kubernetes.io/projected/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-kube-api-access-vbfgv\") on node \"crc\" DevicePath \"\""
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.039252 4922 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.039265 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-inventory\") on node \"crc\" DevicePath \"\""
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.039276 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcdc9bf2-2da5-4261-89b5-dd6111d25d3b-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.415604 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg" event={"ID":"bcdc9bf2-2da5-4261-89b5-dd6111d25d3b","Type":"ContainerDied","Data":"a1915e93194f7a3f2dd49fc346c5800439e160c31e9d376541ddfa96e2588aae"}
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.415681 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1915e93194f7a3f2dd49fc346c5800439e160c31e9d376541ddfa96e2588aae"
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.415770 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vvllg"
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.537507 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl"]
Sep 29 10:22:24 crc kubenswrapper[4922]: E0929 10:22:24.538435 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerName="registry-server"
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.538463 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerName="registry-server"
Sep 29 10:22:24 crc kubenswrapper[4922]: E0929 10:22:24.538509 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerName="extract-utilities"
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.538520 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerName="extract-utilities"
Sep 29 10:22:24 crc kubenswrapper[4922]: E0929 10:22:24.538550 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdc9bf2-2da5-4261-89b5-dd6111d25d3b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.538562 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdc9bf2-2da5-4261-89b5-dd6111d25d3b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 29 10:22:24 crc kubenswrapper[4922]: E0929 10:22:24.538582 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerName="extract-content"
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.538591 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb"
containerName="extract-content" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.538898 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcdc9bf2-2da5-4261-89b5-dd6111d25d3b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.538941 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39be65c-4dcb-4ac3-aee7-f0237de151cb" containerName="registry-server" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.540123 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.548609 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.548609 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.548872 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.551261 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.551381 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.551856 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl"] Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.551877 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.551973 4922 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.653576 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.653695 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.653732 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-894vp\" (UniqueName: \"kubernetes.io/projected/39297e68-ef0c-4e52-922d-28805d4a7171-kube-api-access-894vp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.653757 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.653825 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.654634 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.654927 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.655006 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/39297e68-ef0c-4e52-922d-28805d4a7171-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.655087 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.757531 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.758301 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.758338 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-894vp\" (UniqueName: \"kubernetes.io/projected/39297e68-ef0c-4e52-922d-28805d4a7171-kube-api-access-894vp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.758362 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" 
Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.758442 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.758467 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.758491 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.758515 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/39297e68-ef0c-4e52-922d-28805d4a7171-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.758544 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.759964 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/39297e68-ef0c-4e52-922d-28805d4a7171-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.763091 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.764356 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.764681 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 
10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.764860 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.765382 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.770745 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.762951 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.782235 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-894vp\" (UniqueName: \"kubernetes.io/projected/39297e68-ef0c-4e52-922d-28805d4a7171-kube-api-access-894vp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-575vl\" (UID: 
\"39297e68-ef0c-4e52-922d-28805d4a7171\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:24 crc kubenswrapper[4922]: I0929 10:22:24.862026 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:22:25 crc kubenswrapper[4922]: I0929 10:22:25.477542 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl"] Sep 29 10:22:25 crc kubenswrapper[4922]: I0929 10:22:25.491636 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:22:26 crc kubenswrapper[4922]: I0929 10:22:26.437225 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" event={"ID":"39297e68-ef0c-4e52-922d-28805d4a7171","Type":"ContainerStarted","Data":"ef917bf46ee28ebca81a1181e66d189880a3527ea79a83bd0ab86041b71b0218"} Sep 29 10:22:26 crc kubenswrapper[4922]: I0929 10:22:26.437721 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" event={"ID":"39297e68-ef0c-4e52-922d-28805d4a7171","Type":"ContainerStarted","Data":"b49b9eed79545cef9e7d12f6ca6b761aa810261f7fd66ed739dd380e079b083c"} Sep 29 10:22:26 crc kubenswrapper[4922]: I0929 10:22:26.464088 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" podStartSLOduration=1.941265648 podStartE2EDuration="2.464058949s" podCreationTimestamp="2025-09-29 10:22:24 +0000 UTC" firstStartedPulling="2025-09-29 10:22:25.491355393 +0000 UTC m=+2270.857585657" lastFinishedPulling="2025-09-29 10:22:26.014148694 +0000 UTC m=+2271.380378958" observedRunningTime="2025-09-29 10:22:26.462466636 +0000 UTC m=+2271.828696900" watchObservedRunningTime="2025-09-29 10:22:26.464058949 +0000 UTC m=+2271.830289213" Sep 29 10:22:30 crc 
kubenswrapper[4922]: I0929 10:22:30.452802 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:22:30 crc kubenswrapper[4922]: E0929 10:22:30.454030 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:22:45 crc kubenswrapper[4922]: I0929 10:22:45.460773 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:22:45 crc kubenswrapper[4922]: E0929 10:22:45.461734 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:22:58 crc kubenswrapper[4922]: I0929 10:22:58.451689 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:22:58 crc kubenswrapper[4922]: E0929 10:22:58.452545 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 
29 10:23:12 crc kubenswrapper[4922]: I0929 10:23:12.453008 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:23:12 crc kubenswrapper[4922]: E0929 10:23:12.454124 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:23:25 crc kubenswrapper[4922]: I0929 10:23:25.461115 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:23:25 crc kubenswrapper[4922]: E0929 10:23:25.462169 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:23:40 crc kubenswrapper[4922]: I0929 10:23:40.452540 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:23:40 crc kubenswrapper[4922]: E0929 10:23:40.453793 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:23:52 crc kubenswrapper[4922]: I0929 10:23:52.452689 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:23:52 crc kubenswrapper[4922]: E0929 10:23:52.454055 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:24:05 crc kubenswrapper[4922]: I0929 10:24:05.459548 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:24:05 crc kubenswrapper[4922]: E0929 10:24:05.462321 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:24:16 crc kubenswrapper[4922]: I0929 10:24:16.452269 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:24:16 crc kubenswrapper[4922]: E0929 10:24:16.454269 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:24:28 crc kubenswrapper[4922]: I0929 10:24:28.452873 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:24:28 crc kubenswrapper[4922]: E0929 10:24:28.453927 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:24:42 crc kubenswrapper[4922]: I0929 10:24:42.453552 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:24:42 crc kubenswrapper[4922]: E0929 10:24:42.454390 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:24:57 crc kubenswrapper[4922]: I0929 10:24:57.452972 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:24:57 crc kubenswrapper[4922]: E0929 10:24:57.453636 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:25:11 crc kubenswrapper[4922]: I0929 10:25:11.452613 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:25:11 crc kubenswrapper[4922]: E0929 10:25:11.453420 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:25:24 crc kubenswrapper[4922]: I0929 10:25:24.452052 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:25:24 crc kubenswrapper[4922]: E0929 10:25:24.452796 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.521301 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n2zp9"] Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.533134 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.544319 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2zp9"] Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.618862 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-catalog-content\") pod \"community-operators-n2zp9\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.619092 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-utilities\") pod \"community-operators-n2zp9\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.619178 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v8bh\" (UniqueName: \"kubernetes.io/projected/216a63db-46bb-4a6d-8e98-ebbd4731deee-kube-api-access-6v8bh\") pod \"community-operators-n2zp9\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.721274 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-catalog-content\") pod \"community-operators-n2zp9\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.721872 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-utilities\") pod \"community-operators-n2zp9\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.722063 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-catalog-content\") pod \"community-operators-n2zp9\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.722352 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-utilities\") pod \"community-operators-n2zp9\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.722521 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v8bh\" (UniqueName: \"kubernetes.io/projected/216a63db-46bb-4a6d-8e98-ebbd4731deee-kube-api-access-6v8bh\") pod \"community-operators-n2zp9\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.748077 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v8bh\" (UniqueName: \"kubernetes.io/projected/216a63db-46bb-4a6d-8e98-ebbd4731deee-kube-api-access-6v8bh\") pod \"community-operators-n2zp9\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:37 crc kubenswrapper[4922]: I0929 10:25:37.867010 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:38 crc kubenswrapper[4922]: I0929 10:25:38.425367 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2zp9"] Sep 29 10:25:39 crc kubenswrapper[4922]: I0929 10:25:39.445389 4922 generic.go:334] "Generic (PLEG): container finished" podID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerID="4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff" exitCode=0 Sep 29 10:25:39 crc kubenswrapper[4922]: I0929 10:25:39.445656 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2zp9" event={"ID":"216a63db-46bb-4a6d-8e98-ebbd4731deee","Type":"ContainerDied","Data":"4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff"} Sep 29 10:25:39 crc kubenswrapper[4922]: I0929 10:25:39.445770 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2zp9" event={"ID":"216a63db-46bb-4a6d-8e98-ebbd4731deee","Type":"ContainerStarted","Data":"924d5a57e779531e2223638802991c33a2465fc3828b536fa7d48928a5fe7e17"} Sep 29 10:25:39 crc kubenswrapper[4922]: I0929 10:25:39.452413 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:25:39 crc kubenswrapper[4922]: E0929 10:25:39.452769 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:25:42 crc kubenswrapper[4922]: I0929 10:25:42.485340 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2zp9" 
event={"ID":"216a63db-46bb-4a6d-8e98-ebbd4731deee","Type":"ContainerStarted","Data":"10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4"} Sep 29 10:25:43 crc kubenswrapper[4922]: I0929 10:25:43.499970 4922 generic.go:334] "Generic (PLEG): container finished" podID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerID="10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4" exitCode=0 Sep 29 10:25:43 crc kubenswrapper[4922]: I0929 10:25:43.500064 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2zp9" event={"ID":"216a63db-46bb-4a6d-8e98-ebbd4731deee","Type":"ContainerDied","Data":"10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4"} Sep 29 10:25:44 crc kubenswrapper[4922]: I0929 10:25:44.512991 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2zp9" event={"ID":"216a63db-46bb-4a6d-8e98-ebbd4731deee","Type":"ContainerStarted","Data":"b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124"} Sep 29 10:25:44 crc kubenswrapper[4922]: I0929 10:25:44.536057 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n2zp9" podStartSLOduration=2.796112054 podStartE2EDuration="7.53603422s" podCreationTimestamp="2025-09-29 10:25:37 +0000 UTC" firstStartedPulling="2025-09-29 10:25:39.450062686 +0000 UTC m=+2464.816292950" lastFinishedPulling="2025-09-29 10:25:44.189984852 +0000 UTC m=+2469.556215116" observedRunningTime="2025-09-29 10:25:44.53452818 +0000 UTC m=+2469.900758444" watchObservedRunningTime="2025-09-29 10:25:44.53603422 +0000 UTC m=+2469.902264484" Sep 29 10:25:47 crc kubenswrapper[4922]: I0929 10:25:47.867275 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:47 crc kubenswrapper[4922]: I0929 10:25:47.868441 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:47 crc kubenswrapper[4922]: I0929 10:25:47.931606 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:49 crc kubenswrapper[4922]: I0929 10:25:49.574422 4922 generic.go:334] "Generic (PLEG): container finished" podID="39297e68-ef0c-4e52-922d-28805d4a7171" containerID="ef917bf46ee28ebca81a1181e66d189880a3527ea79a83bd0ab86041b71b0218" exitCode=0 Sep 29 10:25:49 crc kubenswrapper[4922]: I0929 10:25:49.574553 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" event={"ID":"39297e68-ef0c-4e52-922d-28805d4a7171","Type":"ContainerDied","Data":"ef917bf46ee28ebca81a1181e66d189880a3527ea79a83bd0ab86041b71b0218"} Sep 29 10:25:49 crc kubenswrapper[4922]: I0929 10:25:49.632722 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:49 crc kubenswrapper[4922]: I0929 10:25:49.686484 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2zp9"] Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.066013 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.249912 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/39297e68-ef0c-4e52-922d-28805d4a7171-nova-extra-config-0\") pod \"39297e68-ef0c-4e52-922d-28805d4a7171\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.249998 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-ssh-key\") pod \"39297e68-ef0c-4e52-922d-28805d4a7171\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.250059 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-combined-ca-bundle\") pod \"39297e68-ef0c-4e52-922d-28805d4a7171\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.250206 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-894vp\" (UniqueName: \"kubernetes.io/projected/39297e68-ef0c-4e52-922d-28805d4a7171-kube-api-access-894vp\") pod \"39297e68-ef0c-4e52-922d-28805d4a7171\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.250247 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-0\") pod \"39297e68-ef0c-4e52-922d-28805d4a7171\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.250738 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-1\") pod \"39297e68-ef0c-4e52-922d-28805d4a7171\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.250782 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-0\") pod \"39297e68-ef0c-4e52-922d-28805d4a7171\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.250936 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-inventory\") pod \"39297e68-ef0c-4e52-922d-28805d4a7171\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.251014 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-1\") pod \"39297e68-ef0c-4e52-922d-28805d4a7171\" (UID: \"39297e68-ef0c-4e52-922d-28805d4a7171\") " Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.256639 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "39297e68-ef0c-4e52-922d-28805d4a7171" (UID: "39297e68-ef0c-4e52-922d-28805d4a7171"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.257926 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39297e68-ef0c-4e52-922d-28805d4a7171-kube-api-access-894vp" (OuterVolumeSpecName: "kube-api-access-894vp") pod "39297e68-ef0c-4e52-922d-28805d4a7171" (UID: "39297e68-ef0c-4e52-922d-28805d4a7171"). InnerVolumeSpecName "kube-api-access-894vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.296099 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "39297e68-ef0c-4e52-922d-28805d4a7171" (UID: "39297e68-ef0c-4e52-922d-28805d4a7171"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.296085 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "39297e68-ef0c-4e52-922d-28805d4a7171" (UID: "39297e68-ef0c-4e52-922d-28805d4a7171"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.296456 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "39297e68-ef0c-4e52-922d-28805d4a7171" (UID: "39297e68-ef0c-4e52-922d-28805d4a7171"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.296508 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "39297e68-ef0c-4e52-922d-28805d4a7171" (UID: "39297e68-ef0c-4e52-922d-28805d4a7171"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.302770 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "39297e68-ef0c-4e52-922d-28805d4a7171" (UID: "39297e68-ef0c-4e52-922d-28805d4a7171"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.303291 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39297e68-ef0c-4e52-922d-28805d4a7171-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "39297e68-ef0c-4e52-922d-28805d4a7171" (UID: "39297e68-ef0c-4e52-922d-28805d4a7171"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.312138 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-inventory" (OuterVolumeSpecName: "inventory") pod "39297e68-ef0c-4e52-922d-28805d4a7171" (UID: "39297e68-ef0c-4e52-922d-28805d4a7171"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.355186 4922 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/39297e68-ef0c-4e52-922d-28805d4a7171-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.355221 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.355239 4922 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.355250 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-894vp\" (UniqueName: \"kubernetes.io/projected/39297e68-ef0c-4e52-922d-28805d4a7171-kube-api-access-894vp\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.355396 4922 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.355431 4922 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.355442 4922 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-migration-ssh-key-1\") on node \"crc\" 
DevicePath \"\"" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.355455 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.355465 4922 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/39297e68-ef0c-4e52-922d-28805d4a7171-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.597534 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.597519 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-575vl" event={"ID":"39297e68-ef0c-4e52-922d-28805d4a7171","Type":"ContainerDied","Data":"b49b9eed79545cef9e7d12f6ca6b761aa810261f7fd66ed739dd380e079b083c"} Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.597996 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b49b9eed79545cef9e7d12f6ca6b761aa810261f7fd66ed739dd380e079b083c" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.597682 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n2zp9" podUID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerName="registry-server" containerID="cri-o://b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124" gracePeriod=2 Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.727361 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x"] Sep 29 10:25:51 crc kubenswrapper[4922]: E0929 10:25:51.728484 4922 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="39297e68-ef0c-4e52-922d-28805d4a7171" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.728511 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="39297e68-ef0c-4e52-922d-28805d4a7171" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.728897 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="39297e68-ef0c-4e52-922d-28805d4a7171" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.729934 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.733888 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.734153 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.734386 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.734645 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.735091 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6zq2w" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.793802 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x"] Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.871404 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.871514 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlzwv\" (UniqueName: \"kubernetes.io/projected/a810e32e-1655-40f8-b445-9922b0d5603f-kube-api-access-qlzwv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.871568 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.871640 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.871678 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.871710 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.871809 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.973588 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlzwv\" (UniqueName: \"kubernetes.io/projected/a810e32e-1655-40f8-b445-9922b0d5603f-kube-api-access-qlzwv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.973647 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.973683 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.973709 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.973732 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.973788 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.973865 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.978878 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.978941 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.983342 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:51 crc kubenswrapper[4922]: I0929 10:25:51.984742 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.000478 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.004847 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlzwv\" (UniqueName: \"kubernetes.io/projected/a810e32e-1655-40f8-b445-9922b0d5603f-kube-api-access-qlzwv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.006572 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dd76x\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.055347 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.318018 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.487052 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v8bh\" (UniqueName: \"kubernetes.io/projected/216a63db-46bb-4a6d-8e98-ebbd4731deee-kube-api-access-6v8bh\") pod \"216a63db-46bb-4a6d-8e98-ebbd4731deee\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.487465 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-utilities\") pod \"216a63db-46bb-4a6d-8e98-ebbd4731deee\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.487737 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-catalog-content\") pod \"216a63db-46bb-4a6d-8e98-ebbd4731deee\" (UID: \"216a63db-46bb-4a6d-8e98-ebbd4731deee\") " Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.488679 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-utilities" (OuterVolumeSpecName: "utilities") pod "216a63db-46bb-4a6d-8e98-ebbd4731deee" (UID: "216a63db-46bb-4a6d-8e98-ebbd4731deee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.492747 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/216a63db-46bb-4a6d-8e98-ebbd4731deee-kube-api-access-6v8bh" (OuterVolumeSpecName: "kube-api-access-6v8bh") pod "216a63db-46bb-4a6d-8e98-ebbd4731deee" (UID: "216a63db-46bb-4a6d-8e98-ebbd4731deee"). InnerVolumeSpecName "kube-api-access-6v8bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.591807 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v8bh\" (UniqueName: \"kubernetes.io/projected/216a63db-46bb-4a6d-8e98-ebbd4731deee-kube-api-access-6v8bh\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.591849 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.617629 4922 generic.go:334] "Generic (PLEG): container finished" podID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerID="b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124" exitCode=0 Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.617684 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2zp9" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.617692 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2zp9" event={"ID":"216a63db-46bb-4a6d-8e98-ebbd4731deee","Type":"ContainerDied","Data":"b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124"} Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.617736 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2zp9" event={"ID":"216a63db-46bb-4a6d-8e98-ebbd4731deee","Type":"ContainerDied","Data":"924d5a57e779531e2223638802991c33a2465fc3828b536fa7d48928a5fe7e17"} Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.617773 4922 scope.go:117] "RemoveContainer" containerID="b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.641254 4922 scope.go:117] "RemoveContainer" 
containerID="10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.660384 4922 scope.go:117] "RemoveContainer" containerID="4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.695942 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x"] Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.697900 4922 scope.go:117] "RemoveContainer" containerID="b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124" Sep 29 10:25:52 crc kubenswrapper[4922]: E0929 10:25:52.698485 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124\": container with ID starting with b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124 not found: ID does not exist" containerID="b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.698530 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124"} err="failed to get container status \"b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124\": rpc error: code = NotFound desc = could not find container \"b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124\": container with ID starting with b8f88620409932df0ecac5a454eb6ca4f6364c85980aec11bf08df02f8b86124 not found: ID does not exist" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.698557 4922 scope.go:117] "RemoveContainer" containerID="10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4" Sep 29 10:25:52 crc kubenswrapper[4922]: E0929 10:25:52.698903 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4\": container with ID starting with 10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4 not found: ID does not exist" containerID="10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.698935 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4"} err="failed to get container status \"10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4\": rpc error: code = NotFound desc = could not find container \"10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4\": container with ID starting with 10f2a70d8d0b78824dc8a9f42fd1c60cfa89e3890116e056038dbfebf4631df4 not found: ID does not exist" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.698951 4922 scope.go:117] "RemoveContainer" containerID="4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff" Sep 29 10:25:52 crc kubenswrapper[4922]: E0929 10:25:52.699237 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff\": container with ID starting with 4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff not found: ID does not exist" containerID="4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff" Sep 29 10:25:52 crc kubenswrapper[4922]: I0929 10:25:52.699265 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff"} err="failed to get container status \"4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff\": rpc error: code = NotFound desc = could not find container 
\"4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff\": container with ID starting with 4c2b7e7b2beed504721fe387422b1d320e435601f8204cc670541f0db1b835ff not found: ID does not exist" Sep 29 10:25:53 crc kubenswrapper[4922]: I0929 10:25:53.125770 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "216a63db-46bb-4a6d-8e98-ebbd4731deee" (UID: "216a63db-46bb-4a6d-8e98-ebbd4731deee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:25:53 crc kubenswrapper[4922]: I0929 10:25:53.212950 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/216a63db-46bb-4a6d-8e98-ebbd4731deee-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:25:53 crc kubenswrapper[4922]: I0929 10:25:53.436515 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2zp9"] Sep 29 10:25:53 crc kubenswrapper[4922]: I0929 10:25:53.447466 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n2zp9"] Sep 29 10:25:53 crc kubenswrapper[4922]: I0929 10:25:53.451876 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:25:53 crc kubenswrapper[4922]: E0929 10:25:53.452195 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:25:53 crc kubenswrapper[4922]: I0929 10:25:53.462039 4922 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="216a63db-46bb-4a6d-8e98-ebbd4731deee" path="/var/lib/kubelet/pods/216a63db-46bb-4a6d-8e98-ebbd4731deee/volumes" Sep 29 10:25:53 crc kubenswrapper[4922]: I0929 10:25:53.629732 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" event={"ID":"a810e32e-1655-40f8-b445-9922b0d5603f","Type":"ContainerStarted","Data":"d22581e094c96ba887b974710e32d981220c74eed39d43b50707cce5267de41b"} Sep 29 10:25:53 crc kubenswrapper[4922]: I0929 10:25:53.629780 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" event={"ID":"a810e32e-1655-40f8-b445-9922b0d5603f","Type":"ContainerStarted","Data":"dec2a9ba0e16d122559a54b45df2c64ecfa941c9f0b454bca20e7825cc7ebe39"} Sep 29 10:25:53 crc kubenswrapper[4922]: I0929 10:25:53.662032 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" podStartSLOduration=2.150251284 podStartE2EDuration="2.662006743s" podCreationTimestamp="2025-09-29 10:25:51 +0000 UTC" firstStartedPulling="2025-09-29 10:25:52.713278977 +0000 UTC m=+2478.079509241" lastFinishedPulling="2025-09-29 10:25:53.225034426 +0000 UTC m=+2478.591264700" observedRunningTime="2025-09-29 10:25:53.657876412 +0000 UTC m=+2479.024106676" watchObservedRunningTime="2025-09-29 10:25:53.662006743 +0000 UTC m=+2479.028237017" Sep 29 10:26:07 crc kubenswrapper[4922]: I0929 10:26:07.451951 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:26:08 crc kubenswrapper[4922]: I0929 10:26:08.781152 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"f44499784b93677d141253b2016af7d4dc181e349e956779442f4f056a0daa68"} Sep 29 10:28:20 crc kubenswrapper[4922]: I0929 10:28:20.107806 4922 generic.go:334] "Generic (PLEG): container finished" podID="a810e32e-1655-40f8-b445-9922b0d5603f" containerID="d22581e094c96ba887b974710e32d981220c74eed39d43b50707cce5267de41b" exitCode=0 Sep 29 10:28:20 crc kubenswrapper[4922]: I0929 10:28:20.108002 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" event={"ID":"a810e32e-1655-40f8-b445-9922b0d5603f","Type":"ContainerDied","Data":"d22581e094c96ba887b974710e32d981220c74eed39d43b50707cce5267de41b"} Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.593408 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.659714 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ssh-key\") pod \"a810e32e-1655-40f8-b445-9922b0d5603f\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.659852 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlzwv\" (UniqueName: \"kubernetes.io/projected/a810e32e-1655-40f8-b445-9922b0d5603f-kube-api-access-qlzwv\") pod \"a810e32e-1655-40f8-b445-9922b0d5603f\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.659881 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-2\") pod \"a810e32e-1655-40f8-b445-9922b0d5603f\" 
(UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.659919 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-1\") pod \"a810e32e-1655-40f8-b445-9922b0d5603f\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.659979 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-telemetry-combined-ca-bundle\") pod \"a810e32e-1655-40f8-b445-9922b0d5603f\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.660031 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-0\") pod \"a810e32e-1655-40f8-b445-9922b0d5603f\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.660124 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-inventory\") pod \"a810e32e-1655-40f8-b445-9922b0d5603f\" (UID: \"a810e32e-1655-40f8-b445-9922b0d5603f\") " Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.669289 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a810e32e-1655-40f8-b445-9922b0d5603f-kube-api-access-qlzwv" (OuterVolumeSpecName: "kube-api-access-qlzwv") pod "a810e32e-1655-40f8-b445-9922b0d5603f" (UID: "a810e32e-1655-40f8-b445-9922b0d5603f"). InnerVolumeSpecName "kube-api-access-qlzwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.671295 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a810e32e-1655-40f8-b445-9922b0d5603f" (UID: "a810e32e-1655-40f8-b445-9922b0d5603f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.691413 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a810e32e-1655-40f8-b445-9922b0d5603f" (UID: "a810e32e-1655-40f8-b445-9922b0d5603f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.704433 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a810e32e-1655-40f8-b445-9922b0d5603f" (UID: "a810e32e-1655-40f8-b445-9922b0d5603f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.704663 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-inventory" (OuterVolumeSpecName: "inventory") pod "a810e32e-1655-40f8-b445-9922b0d5603f" (UID: "a810e32e-1655-40f8-b445-9922b0d5603f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.716054 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a810e32e-1655-40f8-b445-9922b0d5603f" (UID: "a810e32e-1655-40f8-b445-9922b0d5603f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.727758 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a810e32e-1655-40f8-b445-9922b0d5603f" (UID: "a810e32e-1655-40f8-b445-9922b0d5603f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.763156 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.763185 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlzwv\" (UniqueName: \"kubernetes.io/projected/a810e32e-1655-40f8-b445-9922b0d5603f-kube-api-access-qlzwv\") on node \"crc\" DevicePath \"\"" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.763198 4922 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.763208 4922 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.763219 4922 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.763229 4922 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Sep 29 10:28:21 crc kubenswrapper[4922]: I0929 10:28:21.763237 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a810e32e-1655-40f8-b445-9922b0d5603f-inventory\") on node \"crc\" DevicePath \"\"" Sep 29 10:28:22 crc kubenswrapper[4922]: I0929 10:28:22.131493 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" event={"ID":"a810e32e-1655-40f8-b445-9922b0d5603f","Type":"ContainerDied","Data":"dec2a9ba0e16d122559a54b45df2c64ecfa941c9f0b454bca20e7825cc7ebe39"} Sep 29 10:28:22 crc kubenswrapper[4922]: I0929 10:28:22.131552 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dec2a9ba0e16d122559a54b45df2c64ecfa941c9f0b454bca20e7825cc7ebe39" Sep 29 10:28:22 crc kubenswrapper[4922]: I0929 10:28:22.131639 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dd76x" Sep 29 10:28:29 crc kubenswrapper[4922]: I0929 10:28:29.070941 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:28:29 crc kubenswrapper[4922]: I0929 10:28:29.071559 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:28:59 crc kubenswrapper[4922]: I0929 10:28:59.071412 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:28:59 crc kubenswrapper[4922]: I0929 10:28:59.072551 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.702465 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 29 10:29:11 crc kubenswrapper[4922]: E0929 10:29:11.704889 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerName="registry-server" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 
10:29:11.705011 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerName="registry-server" Sep 29 10:29:11 crc kubenswrapper[4922]: E0929 10:29:11.705136 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a810e32e-1655-40f8-b445-9922b0d5603f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.706231 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a810e32e-1655-40f8-b445-9922b0d5603f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 29 10:29:11 crc kubenswrapper[4922]: E0929 10:29:11.706344 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerName="extract-utilities" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.706428 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerName="extract-utilities" Sep 29 10:29:11 crc kubenswrapper[4922]: E0929 10:29:11.706507 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerName="extract-content" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.706565 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerName="extract-content" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.707032 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a810e32e-1655-40f8-b445-9922b0d5603f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.707127 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="216a63db-46bb-4a6d-8e98-ebbd4731deee" containerName="registry-server" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.708068 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.711092 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.711354 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.711892 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7l6b4" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.711947 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.720668 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.805521 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.805593 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.805887 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-config-data\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.806074 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vfmb\" (UniqueName: \"kubernetes.io/projected/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-kube-api-access-8vfmb\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.806364 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.806508 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.806591 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.806689 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" 
(UniqueName: \"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.806807 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.909397 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.909649 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.909687 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.909716 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.909753 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.909816 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.909923 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.909969 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-config-data\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.910018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vfmb\" (UniqueName: \"kubernetes.io/projected/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-kube-api-access-8vfmb\") pod 
\"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.910453 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.911192 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.911456 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.911584 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-config-data\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.911948 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") device mount path 
\"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.917363 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.918062 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.919998 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.935108 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vfmb\" (UniqueName: \"kubernetes.io/projected/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-kube-api-access-8vfmb\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:11 crc kubenswrapper[4922]: I0929 10:29:11.948931 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " pod="openstack/tempest-tests-tempest" Sep 29 10:29:12 crc kubenswrapper[4922]: I0929 10:29:12.029937 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 29 10:29:12 crc kubenswrapper[4922]: I0929 10:29:12.499236 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 29 10:29:12 crc kubenswrapper[4922]: I0929 10:29:12.510391 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:29:12 crc kubenswrapper[4922]: I0929 10:29:12.655429 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab","Type":"ContainerStarted","Data":"d36e5fa74e7ff5b1cf4c194dceff70e5671af408407c46679581ab7105d5f1f3"} Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.714996 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bnjzn"] Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.718233 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.741388 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bnjzn"] Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.797024 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-utilities\") pod \"redhat-operators-bnjzn\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") " pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.797252 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqkm\" (UniqueName: \"kubernetes.io/projected/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-kube-api-access-vcqkm\") pod \"redhat-operators-bnjzn\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") " 
pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.797393 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-catalog-content\") pod \"redhat-operators-bnjzn\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") " pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.900041 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcqkm\" (UniqueName: \"kubernetes.io/projected/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-kube-api-access-vcqkm\") pod \"redhat-operators-bnjzn\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") " pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.900145 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-catalog-content\") pod \"redhat-operators-bnjzn\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") " pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.900301 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-utilities\") pod \"redhat-operators-bnjzn\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") " pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.900953 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-catalog-content\") pod \"redhat-operators-bnjzn\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") " 
pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.900973 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-utilities\") pod \"redhat-operators-bnjzn\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") " pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:18 crc kubenswrapper[4922]: I0929 10:29:18.931732 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcqkm\" (UniqueName: \"kubernetes.io/projected/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-kube-api-access-vcqkm\") pod \"redhat-operators-bnjzn\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") " pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:19 crc kubenswrapper[4922]: I0929 10:29:19.046571 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:19 crc kubenswrapper[4922]: I0929 10:29:19.856990 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bnjzn"] Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.525650 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qxdpj"] Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.549584 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxdpj"] Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.549718 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.645719 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-catalog-content\") pod \"certified-operators-qxdpj\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") " pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.645914 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77nf6\" (UniqueName: \"kubernetes.io/projected/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-kube-api-access-77nf6\") pod \"certified-operators-qxdpj\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") " pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.645982 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-utilities\") pod \"certified-operators-qxdpj\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") " pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.748526 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77nf6\" (UniqueName: \"kubernetes.io/projected/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-kube-api-access-77nf6\") pod \"certified-operators-qxdpj\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") " pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.748599 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-utilities\") pod 
\"certified-operators-qxdpj\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") " pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.748809 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-catalog-content\") pod \"certified-operators-qxdpj\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") " pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.749465 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-catalog-content\") pod \"certified-operators-qxdpj\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") " pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.750022 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-utilities\") pod \"certified-operators-qxdpj\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") " pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.775685 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77nf6\" (UniqueName: \"kubernetes.io/projected/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-kube-api-access-77nf6\") pod \"certified-operators-qxdpj\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") " pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.814396 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerID="bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114" exitCode=0 Sep 29 10:29:20 crc kubenswrapper[4922]: 
I0929 10:29:20.814454 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnjzn" event={"ID":"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149","Type":"ContainerDied","Data":"bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114"} Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.814483 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnjzn" event={"ID":"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149","Type":"ContainerStarted","Data":"58a543c85959d64f5dc6fb54d0e616cb0e0a4b60795ddb99b593770982c1c502"} Sep 29 10:29:20 crc kubenswrapper[4922]: I0929 10:29:20.881535 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxdpj" Sep 29 10:29:21 crc kubenswrapper[4922]: I0929 10:29:21.424195 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxdpj"] Sep 29 10:29:21 crc kubenswrapper[4922]: W0929 10:29:21.430640 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d79e93_f1e7_4ad3_83ed_2983deb61ef7.slice/crio-94de72322f0f9ec5f9079b9214f45aa5672f05640365e764cdc33a824990bd37 WatchSource:0}: Error finding container 94de72322f0f9ec5f9079b9214f45aa5672f05640365e764cdc33a824990bd37: Status 404 returned error can't find the container with id 94de72322f0f9ec5f9079b9214f45aa5672f05640365e764cdc33a824990bd37 Sep 29 10:29:21 crc kubenswrapper[4922]: I0929 10:29:21.827944 4922 generic.go:334] "Generic (PLEG): container finished" podID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerID="7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83" exitCode=0 Sep 29 10:29:21 crc kubenswrapper[4922]: I0929 10:29:21.828018 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxdpj" 
event={"ID":"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7","Type":"ContainerDied","Data":"7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83"} Sep 29 10:29:21 crc kubenswrapper[4922]: I0929 10:29:21.828319 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxdpj" event={"ID":"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7","Type":"ContainerStarted","Data":"94de72322f0f9ec5f9079b9214f45aa5672f05640365e764cdc33a824990bd37"} Sep 29 10:29:22 crc kubenswrapper[4922]: I0929 10:29:22.845553 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerID="0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c" exitCode=0 Sep 29 10:29:22 crc kubenswrapper[4922]: I0929 10:29:22.845637 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnjzn" event={"ID":"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149","Type":"ContainerDied","Data":"0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c"} Sep 29 10:29:24 crc kubenswrapper[4922]: I0929 10:29:24.918724 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gvs42"] Sep 29 10:29:24 crc kubenswrapper[4922]: I0929 10:29:24.921618 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:24 crc kubenswrapper[4922]: I0929 10:29:24.936847 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvs42"] Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.062465 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-utilities\") pod \"redhat-marketplace-gvs42\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") " pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.062640 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h597q\" (UniqueName: \"kubernetes.io/projected/7a21b6b9-f871-4830-853f-c279214de7af-kube-api-access-h597q\") pod \"redhat-marketplace-gvs42\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") " pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.062775 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-catalog-content\") pod \"redhat-marketplace-gvs42\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") " pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.165191 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-catalog-content\") pod \"redhat-marketplace-gvs42\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") " pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.165397 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-utilities\") pod \"redhat-marketplace-gvs42\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") " pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.165465 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h597q\" (UniqueName: \"kubernetes.io/projected/7a21b6b9-f871-4830-853f-c279214de7af-kube-api-access-h597q\") pod \"redhat-marketplace-gvs42\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") " pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.165700 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-catalog-content\") pod \"redhat-marketplace-gvs42\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") " pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.166081 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-utilities\") pod \"redhat-marketplace-gvs42\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") " pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.195170 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h597q\" (UniqueName: \"kubernetes.io/projected/7a21b6b9-f871-4830-853f-c279214de7af-kube-api-access-h597q\") pod \"redhat-marketplace-gvs42\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") " pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:25 crc kubenswrapper[4922]: I0929 10:29:25.249285 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvs42" Sep 29 10:29:26 crc kubenswrapper[4922]: I0929 10:29:26.537857 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvs42"] Sep 29 10:29:26 crc kubenswrapper[4922]: W0929 10:29:26.550941 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a21b6b9_f871_4830_853f_c279214de7af.slice/crio-739b1b5aea8372a855a43fcefb81e6793395d5e8265782c13c239e85e297509e WatchSource:0}: Error finding container 739b1b5aea8372a855a43fcefb81e6793395d5e8265782c13c239e85e297509e: Status 404 returned error can't find the container with id 739b1b5aea8372a855a43fcefb81e6793395d5e8265782c13c239e85e297509e Sep 29 10:29:26 crc kubenswrapper[4922]: I0929 10:29:26.892401 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnjzn" event={"ID":"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149","Type":"ContainerStarted","Data":"432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be"} Sep 29 10:29:26 crc kubenswrapper[4922]: I0929 10:29:26.894813 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvs42" event={"ID":"7a21b6b9-f871-4830-853f-c279214de7af","Type":"ContainerStarted","Data":"739b1b5aea8372a855a43fcefb81e6793395d5e8265782c13c239e85e297509e"} Sep 29 10:29:26 crc kubenswrapper[4922]: I0929 10:29:26.897446 4922 generic.go:334] "Generic (PLEG): container finished" podID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerID="bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85" exitCode=0 Sep 29 10:29:26 crc kubenswrapper[4922]: I0929 10:29:26.897540 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxdpj" 
event={"ID":"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7","Type":"ContainerDied","Data":"bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85"} Sep 29 10:29:26 crc kubenswrapper[4922]: I0929 10:29:26.919439 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bnjzn" podStartSLOduration=3.703678956 podStartE2EDuration="8.919376106s" podCreationTimestamp="2025-09-29 10:29:18 +0000 UTC" firstStartedPulling="2025-09-29 10:29:20.821113593 +0000 UTC m=+2686.187343857" lastFinishedPulling="2025-09-29 10:29:26.036810743 +0000 UTC m=+2691.403041007" observedRunningTime="2025-09-29 10:29:26.914883914 +0000 UTC m=+2692.281114178" watchObservedRunningTime="2025-09-29 10:29:26.919376106 +0000 UTC m=+2692.285606390" Sep 29 10:29:27 crc kubenswrapper[4922]: I0929 10:29:27.913802 4922 generic.go:334] "Generic (PLEG): container finished" podID="7a21b6b9-f871-4830-853f-c279214de7af" containerID="1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc" exitCode=0 Sep 29 10:29:27 crc kubenswrapper[4922]: I0929 10:29:27.913858 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvs42" event={"ID":"7a21b6b9-f871-4830-853f-c279214de7af","Type":"ContainerDied","Data":"1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc"} Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.047046 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.047107 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.070380 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.070435 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.070477 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.071248 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f44499784b93677d141253b2016af7d4dc181e349e956779442f4f056a0daa68"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.071312 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://f44499784b93677d141253b2016af7d4dc181e349e956779442f4f056a0daa68" gracePeriod=600 Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.940265 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="f44499784b93677d141253b2016af7d4dc181e349e956779442f4f056a0daa68" exitCode=0 Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.940501 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"f44499784b93677d141253b2016af7d4dc181e349e956779442f4f056a0daa68"} Sep 29 10:29:29 crc kubenswrapper[4922]: I0929 10:29:29.940627 4922 scope.go:117] "RemoveContainer" containerID="9b61128ccb10d39facb3c8b70bd5da321f9b540f3b0b59c40d40d605eea817b5" Sep 29 10:29:30 crc kubenswrapper[4922]: I0929 10:29:30.113933 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bnjzn" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="registry-server" probeResult="failure" output=< Sep 29 10:29:30 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Sep 29 10:29:30 crc kubenswrapper[4922]: > Sep 29 10:29:40 crc kubenswrapper[4922]: I0929 10:29:40.100183 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bnjzn" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="registry-server" probeResult="failure" output=< Sep 29 10:29:40 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Sep 29 10:29:40 crc kubenswrapper[4922]: > Sep 29 10:29:50 crc kubenswrapper[4922]: I0929 10:29:50.093015 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bnjzn" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="registry-server" probeResult="failure" output=< Sep 29 10:29:50 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Sep 29 10:29:50 crc kubenswrapper[4922]: > Sep 29 10:29:54 crc kubenswrapper[4922]: E0929 10:29:54.009458 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Sep 29 10:29:54 crc kubenswrapper[4922]: E0929 10:29:54.010303 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8vfmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(b34ceaf2-30f5-4be7-8806-fad8a2bd21ab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 10:29:54 crc kubenswrapper[4922]: E0929 10:29:54.011467 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" Sep 29 10:29:54 crc kubenswrapper[4922]: E0929 10:29:54.220855 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" Sep 29 10:29:55 crc 
kubenswrapper[4922]: I0929 10:29:55.228339 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxdpj" event={"ID":"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7","Type":"ContainerStarted","Data":"56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee"}
Sep 29 10:29:55 crc kubenswrapper[4922]: I0929 10:29:55.231236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530"}
Sep 29 10:29:55 crc kubenswrapper[4922]: I0929 10:29:55.252963 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qxdpj" podStartSLOduration=3.074173084 podStartE2EDuration="35.252940458s" podCreationTimestamp="2025-09-29 10:29:20 +0000 UTC" firstStartedPulling="2025-09-29 10:29:21.830327717 +0000 UTC m=+2687.196557981" lastFinishedPulling="2025-09-29 10:29:54.009095091 +0000 UTC m=+2719.375325355" observedRunningTime="2025-09-29 10:29:55.246753271 +0000 UTC m=+2720.612983555" watchObservedRunningTime="2025-09-29 10:29:55.252940458 +0000 UTC m=+2720.619170712"
Sep 29 10:29:58 crc kubenswrapper[4922]: I0929 10:29:58.262104 4922 generic.go:334] "Generic (PLEG): container finished" podID="7a21b6b9-f871-4830-853f-c279214de7af" containerID="3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0" exitCode=0
Sep 29 10:29:58 crc kubenswrapper[4922]: I0929 10:29:58.262228 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvs42" event={"ID":"7a21b6b9-f871-4830-853f-c279214de7af","Type":"ContainerDied","Data":"3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0"}
Sep 29 10:29:59 crc kubenswrapper[4922]: I0929 10:29:59.275387 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvs42" event={"ID":"7a21b6b9-f871-4830-853f-c279214de7af","Type":"ContainerStarted","Data":"a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d"}
Sep 29 10:29:59 crc kubenswrapper[4922]: I0929 10:29:59.303386 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gvs42" podStartSLOduration=30.524279744 podStartE2EDuration="35.303363168s" podCreationTimestamp="2025-09-29 10:29:24 +0000 UTC" firstStartedPulling="2025-09-29 10:29:53.89422933 +0000 UTC m=+2719.260459594" lastFinishedPulling="2025-09-29 10:29:58.673312754 +0000 UTC m=+2724.039543018" observedRunningTime="2025-09-29 10:29:59.29674546 +0000 UTC m=+2724.662975734" watchObservedRunningTime="2025-09-29 10:29:59.303363168 +0000 UTC m=+2724.669593442"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.102956 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bnjzn" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="registry-server" probeResult="failure" output=<
Sep 29 10:30:00 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Sep 29 10:30:00 crc kubenswrapper[4922]: >
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.151772 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"]
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.153388 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.156812 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.160437 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.165328 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"]
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.202001 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-secret-volume\") pod \"collect-profiles-29319030-4p42l\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.202074 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-config-volume\") pod \"collect-profiles-29319030-4p42l\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.202271 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jmz\" (UniqueName: \"kubernetes.io/projected/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-kube-api-access-l2jmz\") pod \"collect-profiles-29319030-4p42l\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.303772 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-secret-volume\") pod \"collect-profiles-29319030-4p42l\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.303905 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-config-volume\") pod \"collect-profiles-29319030-4p42l\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.304017 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jmz\" (UniqueName: \"kubernetes.io/projected/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-kube-api-access-l2jmz\") pod \"collect-profiles-29319030-4p42l\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.304951 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-config-volume\") pod \"collect-profiles-29319030-4p42l\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.311191 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-secret-volume\") pod \"collect-profiles-29319030-4p42l\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.321442 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jmz\" (UniqueName: \"kubernetes.io/projected/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-kube-api-access-l2jmz\") pod \"collect-profiles-29319030-4p42l\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.474522 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.882416 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qxdpj"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.882761 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qxdpj"
Sep 29 10:30:00 crc kubenswrapper[4922]: I0929 10:30:00.924227 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"]
Sep 29 10:30:00 crc kubenswrapper[4922]: W0929 10:30:00.928050 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d032ff9_2b1f_4f9c_8e41_61d52020bbf5.slice/crio-b7159b49c41a96658d6e38b46af9adc0ccdb6fa2b1e3695974acdb5310c85bdc WatchSource:0}: Error finding container b7159b49c41a96658d6e38b46af9adc0ccdb6fa2b1e3695974acdb5310c85bdc: Status 404 returned error can't find the container with id b7159b49c41a96658d6e38b46af9adc0ccdb6fa2b1e3695974acdb5310c85bdc
Sep 29 10:30:01 crc kubenswrapper[4922]: I0929 10:30:01.295233 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l" event={"ID":"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5","Type":"ContainerStarted","Data":"d9509ce3cd360a05f94a2e84e0e1ea1f6f6836c679d0bd838812ad5394527421"}
Sep 29 10:30:01 crc kubenswrapper[4922]: I0929 10:30:01.295592 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l" event={"ID":"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5","Type":"ContainerStarted","Data":"b7159b49c41a96658d6e38b46af9adc0ccdb6fa2b1e3695974acdb5310c85bdc"}
Sep 29 10:30:01 crc kubenswrapper[4922]: I0929 10:30:01.946987 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qxdpj" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerName="registry-server" probeResult="failure" output=<
Sep 29 10:30:01 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Sep 29 10:30:01 crc kubenswrapper[4922]: >
Sep 29 10:30:02 crc kubenswrapper[4922]: I0929 10:30:02.314610 4922 generic.go:334] "Generic (PLEG): container finished" podID="3d032ff9-2b1f-4f9c-8e41-61d52020bbf5" containerID="d9509ce3cd360a05f94a2e84e0e1ea1f6f6836c679d0bd838812ad5394527421" exitCode=0
Sep 29 10:30:02 crc kubenswrapper[4922]: I0929 10:30:02.314944 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l" event={"ID":"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5","Type":"ContainerDied","Data":"d9509ce3cd360a05f94a2e84e0e1ea1f6f6836c679d0bd838812ad5394527421"}
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.648572 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.677560 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-secret-volume\") pod \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") "
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.685406 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3d032ff9-2b1f-4f9c-8e41-61d52020bbf5" (UID: "3d032ff9-2b1f-4f9c-8e41-61d52020bbf5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.778760 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-config-volume\") pod \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") "
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.778910 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2jmz\" (UniqueName: \"kubernetes.io/projected/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-kube-api-access-l2jmz\") pod \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\" (UID: \"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5\") "
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.779359 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.779596 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-config-volume" (OuterVolumeSpecName: "config-volume") pod "3d032ff9-2b1f-4f9c-8e41-61d52020bbf5" (UID: "3d032ff9-2b1f-4f9c-8e41-61d52020bbf5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.783276 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-kube-api-access-l2jmz" (OuterVolumeSpecName: "kube-api-access-l2jmz") pod "3d032ff9-2b1f-4f9c-8e41-61d52020bbf5" (UID: "3d032ff9-2b1f-4f9c-8e41-61d52020bbf5"). InnerVolumeSpecName "kube-api-access-l2jmz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.881551 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-config-volume\") on node \"crc\" DevicePath \"\""
Sep 29 10:30:03 crc kubenswrapper[4922]: I0929 10:30:03.881594 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2jmz\" (UniqueName: \"kubernetes.io/projected/3d032ff9-2b1f-4f9c-8e41-61d52020bbf5-kube-api-access-l2jmz\") on node \"crc\" DevicePath \"\""
Sep 29 10:30:04 crc kubenswrapper[4922]: I0929 10:30:04.345991 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l"
Sep 29 10:30:04 crc kubenswrapper[4922]: I0929 10:30:04.345998 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319030-4p42l" event={"ID":"3d032ff9-2b1f-4f9c-8e41-61d52020bbf5","Type":"ContainerDied","Data":"b7159b49c41a96658d6e38b46af9adc0ccdb6fa2b1e3695974acdb5310c85bdc"}
Sep 29 10:30:04 crc kubenswrapper[4922]: I0929 10:30:04.346724 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7159b49c41a96658d6e38b46af9adc0ccdb6fa2b1e3695974acdb5310c85bdc"
Sep 29 10:30:04 crc kubenswrapper[4922]: I0929 10:30:04.394497 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn"]
Sep 29 10:30:04 crc kubenswrapper[4922]: I0929 10:30:04.402025 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29318985-bq6tn"]
Sep 29 10:30:05 crc kubenswrapper[4922]: I0929 10:30:05.249480 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gvs42"
Sep 29 10:30:05 crc kubenswrapper[4922]: I0929 10:30:05.249927 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gvs42"
Sep 29 10:30:05 crc kubenswrapper[4922]: I0929 10:30:05.299934 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gvs42"
Sep 29 10:30:05 crc kubenswrapper[4922]: I0929 10:30:05.401797 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gvs42"
Sep 29 10:30:05 crc kubenswrapper[4922]: I0929 10:30:05.465584 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24b1532-d6be-4a8e-a843-742f6328c431" path="/var/lib/kubelet/pods/a24b1532-d6be-4a8e-a843-742f6328c431/volumes"
Sep 29 10:30:05 crc kubenswrapper[4922]: I0929 10:30:05.533350 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvs42"]
Sep 29 10:30:07 crc kubenswrapper[4922]: I0929 10:30:07.380647 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gvs42" podUID="7a21b6b9-f871-4830-853f-c279214de7af" containerName="registry-server" containerID="cri-o://a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d" gracePeriod=2
Sep 29 10:30:07 crc kubenswrapper[4922]: I0929 10:30:07.878974 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvs42"
Sep 29 10:30:07 crc kubenswrapper[4922]: I0929 10:30:07.978223 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-utilities\") pod \"7a21b6b9-f871-4830-853f-c279214de7af\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") "
Sep 29 10:30:07 crc kubenswrapper[4922]: I0929 10:30:07.978607 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h597q\" (UniqueName: \"kubernetes.io/projected/7a21b6b9-f871-4830-853f-c279214de7af-kube-api-access-h597q\") pod \"7a21b6b9-f871-4830-853f-c279214de7af\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") "
Sep 29 10:30:07 crc kubenswrapper[4922]: I0929 10:30:07.978924 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-catalog-content\") pod \"7a21b6b9-f871-4830-853f-c279214de7af\" (UID: \"7a21b6b9-f871-4830-853f-c279214de7af\") "
Sep 29 10:30:07 crc kubenswrapper[4922]: I0929 10:30:07.980174 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-utilities" (OuterVolumeSpecName: "utilities") pod "7a21b6b9-f871-4830-853f-c279214de7af" (UID: "7a21b6b9-f871-4830-853f-c279214de7af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:30:07 crc kubenswrapper[4922]: I0929 10:30:07.985600 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a21b6b9-f871-4830-853f-c279214de7af-kube-api-access-h597q" (OuterVolumeSpecName: "kube-api-access-h597q") pod "7a21b6b9-f871-4830-853f-c279214de7af" (UID: "7a21b6b9-f871-4830-853f-c279214de7af"). InnerVolumeSpecName "kube-api-access-h597q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:30:07 crc kubenswrapper[4922]: I0929 10:30:07.994307 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a21b6b9-f871-4830-853f-c279214de7af" (UID: "7a21b6b9-f871-4830-853f-c279214de7af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.080928 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.080968 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a21b6b9-f871-4830-853f-c279214de7af-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.080982 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h597q\" (UniqueName: \"kubernetes.io/projected/7a21b6b9-f871-4830-853f-c279214de7af-kube-api-access-h597q\") on node \"crc\" DevicePath \"\""
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.393739 4922 generic.go:334] "Generic (PLEG): container finished" podID="7a21b6b9-f871-4830-853f-c279214de7af" containerID="a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d" exitCode=0
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.393807 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvs42" event={"ID":"7a21b6b9-f871-4830-853f-c279214de7af","Type":"ContainerDied","Data":"a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d"}
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.393910 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvs42" event={"ID":"7a21b6b9-f871-4830-853f-c279214de7af","Type":"ContainerDied","Data":"739b1b5aea8372a855a43fcefb81e6793395d5e8265782c13c239e85e297509e"}
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.393943 4922 scope.go:117] "RemoveContainer" containerID="a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.393936 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvs42"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.419225 4922 scope.go:117] "RemoveContainer" containerID="3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.434281 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvs42"]
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.445404 4922 scope.go:117] "RemoveContainer" containerID="1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.446633 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvs42"]
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.498904 4922 scope.go:117] "RemoveContainer" containerID="a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d"
Sep 29 10:30:08 crc kubenswrapper[4922]: E0929 10:30:08.499689 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d\": container with ID starting with a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d not found: ID does not exist" containerID="a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.499732 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d"} err="failed to get container status \"a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d\": rpc error: code = NotFound desc = could not find container \"a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d\": container with ID starting with a9abac1de2e48dc28f21a27b4575d4cbd294a680cfe699470424a2104497ef0d not found: ID does not exist"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.499762 4922 scope.go:117] "RemoveContainer" containerID="3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0"
Sep 29 10:30:08 crc kubenswrapper[4922]: E0929 10:30:08.500249 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0\": container with ID starting with 3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0 not found: ID does not exist" containerID="3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.500276 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0"} err="failed to get container status \"3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0\": rpc error: code = NotFound desc = could not find container \"3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0\": container with ID starting with 3665262b65f87c88478263367f0717c8ad9927606709277aa45fe2f31fbe91a0 not found: ID does not exist"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.500295 4922 scope.go:117] "RemoveContainer" containerID="1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc"
Sep 29 10:30:08 crc kubenswrapper[4922]: E0929 10:30:08.500493 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc\": container with ID starting with 1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc not found: ID does not exist" containerID="1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.500515 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc"} err="failed to get container status \"1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc\": rpc error: code = NotFound desc = could not find container \"1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc\": container with ID starting with 1350331a5391831d28c848d25a2786ae138a1496127cba05dcd968259cbf11cc not found: ID does not exist"
Sep 29 10:30:08 crc kubenswrapper[4922]: I0929 10:30:08.936262 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Sep 29 10:30:09 crc kubenswrapper[4922]: I0929 10:30:09.472462 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a21b6b9-f871-4830-853f-c279214de7af" path="/var/lib/kubelet/pods/7a21b6b9-f871-4830-853f-c279214de7af/volumes"
Sep 29 10:30:10 crc kubenswrapper[4922]: I0929 10:30:10.114157 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bnjzn" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="registry-server" probeResult="failure" output=<
Sep 29 10:30:10 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Sep 29 10:30:10 crc kubenswrapper[4922]: >
Sep 29 10:30:10 crc kubenswrapper[4922]: I0929 10:30:10.433932 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab","Type":"ContainerStarted","Data":"e3e40029d4e3ab2cf562fc4432270a3d3a7655bfc09b70c5f3a8235848974c56"}
Sep 29 10:30:10 crc kubenswrapper[4922]: I0929 10:30:10.465797 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.042769517 podStartE2EDuration="1m0.465774812s" podCreationTimestamp="2025-09-29 10:29:10 +0000 UTC" firstStartedPulling="2025-09-29 10:29:12.510139993 +0000 UTC m=+2677.876370257" lastFinishedPulling="2025-09-29 10:30:08.933145258 +0000 UTC m=+2734.299375552" observedRunningTime="2025-09-29 10:30:10.463665166 +0000 UTC m=+2735.829895440" watchObservedRunningTime="2025-09-29 10:30:10.465774812 +0000 UTC m=+2735.832005076"
Sep 29 10:30:10 crc kubenswrapper[4922]: I0929 10:30:10.943316 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qxdpj"
Sep 29 10:30:11 crc kubenswrapper[4922]: I0929 10:30:11.001730 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qxdpj"
Sep 29 10:30:11 crc kubenswrapper[4922]: I0929 10:30:11.938272 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxdpj"]
Sep 29 10:30:12 crc kubenswrapper[4922]: I0929 10:30:12.459569 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qxdpj" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerName="registry-server" containerID="cri-o://56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee" gracePeriod=2
Sep 29 10:30:12 crc kubenswrapper[4922]: I0929 10:30:12.972302 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxdpj"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.002338 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77nf6\" (UniqueName: \"kubernetes.io/projected/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-kube-api-access-77nf6\") pod \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") "
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.002605 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-utilities\") pod \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") "
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.002704 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-catalog-content\") pod \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\" (UID: \"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7\") "
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.004070 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-utilities" (OuterVolumeSpecName: "utilities") pod "f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" (UID: "f0d79e93-f1e7-4ad3-83ed-2983deb61ef7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.009458 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-kube-api-access-77nf6" (OuterVolumeSpecName: "kube-api-access-77nf6") pod "f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" (UID: "f0d79e93-f1e7-4ad3-83ed-2983deb61ef7"). InnerVolumeSpecName "kube-api-access-77nf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.062658 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" (UID: "f0d79e93-f1e7-4ad3-83ed-2983deb61ef7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.105353 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.105429 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.105444 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77nf6\" (UniqueName: \"kubernetes.io/projected/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7-kube-api-access-77nf6\") on node \"crc\" DevicePath \"\""
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.470207 4922 generic.go:334] "Generic (PLEG): container finished" podID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerID="56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee" exitCode=0
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.470255 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxdpj" event={"ID":"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7","Type":"ContainerDied","Data":"56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee"}
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.470288 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxdpj" event={"ID":"f0d79e93-f1e7-4ad3-83ed-2983deb61ef7","Type":"ContainerDied","Data":"94de72322f0f9ec5f9079b9214f45aa5672f05640365e764cdc33a824990bd37"}
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.470299 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxdpj"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.470311 4922 scope.go:117] "RemoveContainer" containerID="56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.498158 4922 scope.go:117] "RemoveContainer" containerID="bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.506320 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxdpj"]
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.515283 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qxdpj"]
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.536620 4922 scope.go:117] "RemoveContainer" containerID="7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.586068 4922 scope.go:117] "RemoveContainer" containerID="56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee"
Sep 29 10:30:13 crc kubenswrapper[4922]: E0929 10:30:13.587664 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee\": container with ID starting with 56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee not found: ID does not exist" containerID="56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.587737 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee"} err="failed to get container status \"56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee\": rpc error: code = NotFound desc = could not find container \"56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee\": container with ID starting with 56e100e9911884b7468179e13a8e99bbc0864e247c44ba41bfccb3b80e350dee not found: ID does not exist"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.587777 4922 scope.go:117] "RemoveContainer" containerID="bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85"
Sep 29 10:30:13 crc kubenswrapper[4922]: E0929 10:30:13.588682 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85\": container with ID starting with bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85 not found: ID does not exist" containerID="bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.588727 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85"} err="failed to get container status \"bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85\": rpc error: code = NotFound desc = could not find container \"bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85\": container with ID starting with bc5695aa9daae3559018aac3b523acddbc7fe99da8babe40b1b2b05342ea3f85 not found: ID does not exist"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.589104 4922 scope.go:117] "RemoveContainer" containerID="7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83"
Sep 29 10:30:13 crc kubenswrapper[4922]: E0929 10:30:13.589982 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83\": container with ID starting with 7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83 not found: ID does not exist" containerID="7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83"
Sep 29 10:30:13 crc kubenswrapper[4922]: I0929 10:30:13.590058 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83"} err="failed to get container status \"7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83\": rpc error: code = NotFound desc = could not find container \"7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83\": container with ID starting with 7fa9e7d79a27de2f67daa6ba2dfcfd008a40fd61ecd79b5961f9f287fa787c83 not found: ID does not exist"
Sep 29 10:30:15 crc kubenswrapper[4922]: I0929 10:30:15.466594 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" path="/var/lib/kubelet/pods/f0d79e93-f1e7-4ad3-83ed-2983deb61ef7/volumes"
Sep 29 10:30:19 crc kubenswrapper[4922]: I0929 10:30:19.118968 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bnjzn"
Sep 29 10:30:19 crc kubenswrapper[4922]: I0929 10:30:19.173698 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bnjzn"
Sep 29 10:30:19 crc kubenswrapper[4922]: I0929 10:30:19.938259 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bnjzn"]
Sep 29 10:30:20 crc kubenswrapper[4922]: I0929 10:30:20.553229 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bnjzn" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="registry-server" containerID="cri-o://432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be" gracePeriod=2
Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.079412 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bnjzn"
Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.178380 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcqkm\" (UniqueName: \"kubernetes.io/projected/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-kube-api-access-vcqkm\") pod \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") "
Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.178676 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-catalog-content\") pod \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") "
Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.178871 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-utilities\") pod \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\" (UID: \"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149\") "
Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.180086 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-utilities" (OuterVolumeSpecName: "utilities") pod "b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" (UID: "b4b4fd3d-fef7-4833-8d37-c44fb6d8e149"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.190037 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-kube-api-access-vcqkm" (OuterVolumeSpecName: "kube-api-access-vcqkm") pod "b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" (UID: "b4b4fd3d-fef7-4833-8d37-c44fb6d8e149"). InnerVolumeSpecName "kube-api-access-vcqkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.280806 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.280850 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcqkm\" (UniqueName: \"kubernetes.io/projected/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-kube-api-access-vcqkm\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.301463 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" (UID: "b4b4fd3d-fef7-4833-8d37-c44fb6d8e149"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.383073 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.564124 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerID="432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be" exitCode=0 Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.564169 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnjzn" event={"ID":"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149","Type":"ContainerDied","Data":"432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be"} Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.564206 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnjzn" event={"ID":"b4b4fd3d-fef7-4833-8d37-c44fb6d8e149","Type":"ContainerDied","Data":"58a543c85959d64f5dc6fb54d0e616cb0e0a4b60795ddb99b593770982c1c502"} Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.564226 4922 scope.go:117] "RemoveContainer" containerID="432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.564347 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bnjzn" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.594068 4922 scope.go:117] "RemoveContainer" containerID="0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.600356 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bnjzn"] Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.613104 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bnjzn"] Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.614955 4922 scope.go:117] "RemoveContainer" containerID="bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.687542 4922 scope.go:117] "RemoveContainer" containerID="432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be" Sep 29 10:30:21 crc kubenswrapper[4922]: E0929 10:30:21.688146 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be\": container with ID starting with 432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be not found: ID does not exist" containerID="432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.688213 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be"} err="failed to get container status \"432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be\": rpc error: code = NotFound desc = could not find container \"432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be\": container with ID starting with 432beb5a918c492c70b447fa2def9a5a2b3bd39aeb0a0f86900c299b5682f5be not found: ID does 
not exist" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.688259 4922 scope.go:117] "RemoveContainer" containerID="0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c" Sep 29 10:30:21 crc kubenswrapper[4922]: E0929 10:30:21.688978 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c\": container with ID starting with 0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c not found: ID does not exist" containerID="0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.689032 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c"} err="failed to get container status \"0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c\": rpc error: code = NotFound desc = could not find container \"0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c\": container with ID starting with 0ca191c5d2b4ca8cfe58e3732a481b9d9af58eb5502511a38e5b3fb8fea25c7c not found: ID does not exist" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.689072 4922 scope.go:117] "RemoveContainer" containerID="bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114" Sep 29 10:30:21 crc kubenswrapper[4922]: E0929 10:30:21.689496 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114\": container with ID starting with bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114 not found: ID does not exist" containerID="bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114" Sep 29 10:30:21 crc kubenswrapper[4922]: I0929 10:30:21.689531 4922 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114"} err="failed to get container status \"bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114\": rpc error: code = NotFound desc = could not find container \"bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114\": container with ID starting with bcb5479c176630a1bea18f1236d6815fb1a8492c5535ddd690ceae4376a0f114 not found: ID does not exist" Sep 29 10:30:23 crc kubenswrapper[4922]: I0929 10:30:23.464322 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" path="/var/lib/kubelet/pods/b4b4fd3d-fef7-4833-8d37-c44fb6d8e149/volumes" Sep 29 10:30:51 crc kubenswrapper[4922]: I0929 10:30:51.715629 4922 scope.go:117] "RemoveContainer" containerID="48650bacf61792045d993e8c47659b1e73eb28f0e910022ea3053b405786a628" Sep 29 10:31:59 crc kubenswrapper[4922]: I0929 10:31:59.071319 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:31:59 crc kubenswrapper[4922]: I0929 10:31:59.072104 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:32:29 crc kubenswrapper[4922]: I0929 10:32:29.070601 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Sep 29 10:32:29 crc kubenswrapper[4922]: I0929 10:32:29.071273 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:32:59 crc kubenswrapper[4922]: I0929 10:32:59.070712 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:32:59 crc kubenswrapper[4922]: I0929 10:32:59.071305 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:32:59 crc kubenswrapper[4922]: I0929 10:32:59.071362 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 10:32:59 crc kubenswrapper[4922]: I0929 10:32:59.072124 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:32:59 crc kubenswrapper[4922]: I0929 10:32:59.072187 4922 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" gracePeriod=600 Sep 29 10:32:59 crc kubenswrapper[4922]: E0929 10:32:59.924554 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:33:00 crc kubenswrapper[4922]: I0929 10:33:00.273522 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" exitCode=0 Sep 29 10:33:00 crc kubenswrapper[4922]: I0929 10:33:00.273596 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530"} Sep 29 10:33:00 crc kubenswrapper[4922]: I0929 10:33:00.274069 4922 scope.go:117] "RemoveContainer" containerID="f44499784b93677d141253b2016af7d4dc181e349e956779442f4f056a0daa68" Sep 29 10:33:00 crc kubenswrapper[4922]: I0929 10:33:00.274866 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:33:00 crc kubenswrapper[4922]: E0929 10:33:00.275117 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:33:13 crc kubenswrapper[4922]: I0929 10:33:13.452379 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:33:13 crc kubenswrapper[4922]: E0929 10:33:13.454953 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:33:27 crc kubenswrapper[4922]: I0929 10:33:27.452785 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:33:27 crc kubenswrapper[4922]: E0929 10:33:27.453992 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:33:42 crc kubenswrapper[4922]: I0929 10:33:42.452823 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:33:42 crc kubenswrapper[4922]: E0929 10:33:42.453978 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:33:54 crc kubenswrapper[4922]: I0929 10:33:54.453040 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:33:54 crc kubenswrapper[4922]: E0929 10:33:54.454301 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:34:05 crc kubenswrapper[4922]: I0929 10:34:05.457789 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:34:05 crc kubenswrapper[4922]: E0929 10:34:05.458636 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:34:17 crc kubenswrapper[4922]: I0929 10:34:17.453424 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:34:17 crc kubenswrapper[4922]: E0929 10:34:17.454409 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:34:30 crc kubenswrapper[4922]: I0929 10:34:30.452236 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:34:30 crc kubenswrapper[4922]: E0929 10:34:30.453132 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:34:43 crc kubenswrapper[4922]: I0929 10:34:43.452479 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:34:43 crc kubenswrapper[4922]: E0929 10:34:43.453318 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:34:54 crc kubenswrapper[4922]: I0929 10:34:54.451888 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:34:54 crc kubenswrapper[4922]: E0929 10:34:54.452969 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:35:06 crc kubenswrapper[4922]: I0929 10:35:06.452032 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:35:06 crc kubenswrapper[4922]: E0929 10:35:06.453209 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:35:20 crc kubenswrapper[4922]: I0929 10:35:20.451997 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:35:20 crc kubenswrapper[4922]: E0929 10:35:20.452938 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:35:35 crc kubenswrapper[4922]: I0929 10:35:35.458827 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:35:35 crc kubenswrapper[4922]: E0929 10:35:35.459722 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:35:46 crc kubenswrapper[4922]: I0929 10:35:46.452248 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:35:46 crc kubenswrapper[4922]: E0929 10:35:46.453917 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:35:57 crc kubenswrapper[4922]: I0929 10:35:57.452570 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:35:57 crc kubenswrapper[4922]: E0929 10:35:57.453675 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:36:12 crc kubenswrapper[4922]: I0929 10:36:12.452598 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:36:12 crc kubenswrapper[4922]: E0929 10:36:12.453406 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:36:24 crc kubenswrapper[4922]: I0929 10:36:24.451557 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:36:24 crc kubenswrapper[4922]: E0929 10:36:24.452358 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.453567 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.454750 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.574772 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k8gvc"] Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.575282 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3d032ff9-2b1f-4f9c-8e41-61d52020bbf5" containerName="collect-profiles" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.575307 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d032ff9-2b1f-4f9c-8e41-61d52020bbf5" containerName="collect-profiles" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.575321 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a21b6b9-f871-4830-853f-c279214de7af" containerName="extract-utilities" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.575330 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a21b6b9-f871-4830-853f-c279214de7af" containerName="extract-utilities" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.576790 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="registry-server" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.576815 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="registry-server" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.576845 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="extract-utilities" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.576855 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="extract-utilities" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.576877 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a21b6b9-f871-4830-853f-c279214de7af" containerName="extract-content" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.576885 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a21b6b9-f871-4830-853f-c279214de7af" containerName="extract-content" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.576897 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7a21b6b9-f871-4830-853f-c279214de7af" containerName="registry-server" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.576905 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a21b6b9-f871-4830-853f-c279214de7af" containerName="registry-server" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.576932 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerName="extract-utilities" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.576940 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerName="extract-utilities" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.576953 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerName="extract-content" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.576960 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerName="extract-content" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.577003 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerName="registry-server" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.577012 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerName="registry-server" Sep 29 10:36:39 crc kubenswrapper[4922]: E0929 10:36:39.577028 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="extract-content" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.577035 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="extract-content" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.577338 4922 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7a21b6b9-f871-4830-853f-c279214de7af" containerName="registry-server" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.577359 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d79e93-f1e7-4ad3-83ed-2983deb61ef7" containerName="registry-server" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.577380 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d032ff9-2b1f-4f9c-8e41-61d52020bbf5" containerName="collect-profiles" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.577400 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b4fd3d-fef7-4833-8d37-c44fb6d8e149" containerName="registry-server" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.580104 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.589742 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8gvc"] Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.651786 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf2h7\" (UniqueName: \"kubernetes.io/projected/ffffb3d3-b95a-4e56-8685-e32218a78bc4-kube-api-access-mf2h7\") pod \"community-operators-k8gvc\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.652338 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-catalog-content\") pod \"community-operators-k8gvc\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.652377 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-utilities\") pod \"community-operators-k8gvc\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.753847 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf2h7\" (UniqueName: \"kubernetes.io/projected/ffffb3d3-b95a-4e56-8685-e32218a78bc4-kube-api-access-mf2h7\") pod \"community-operators-k8gvc\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.754356 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-catalog-content\") pod \"community-operators-k8gvc\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.754464 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-utilities\") pod \"community-operators-k8gvc\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.755418 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-catalog-content\") pod \"community-operators-k8gvc\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.755579 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-utilities\") pod \"community-operators-k8gvc\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.784664 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf2h7\" (UniqueName: \"kubernetes.io/projected/ffffb3d3-b95a-4e56-8685-e32218a78bc4-kube-api-access-mf2h7\") pod \"community-operators-k8gvc\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:39 crc kubenswrapper[4922]: I0929 10:36:39.901801 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:40 crc kubenswrapper[4922]: I0929 10:36:40.496461 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8gvc"] Sep 29 10:36:41 crc kubenswrapper[4922]: I0929 10:36:41.360744 4922 generic.go:334] "Generic (PLEG): container finished" podID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerID="c7bf540db0462c915aa3c2f1831bc13f8d570a9fa02dde8c13a89a841d7df099" exitCode=0 Sep 29 10:36:41 crc kubenswrapper[4922]: I0929 10:36:41.360937 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gvc" event={"ID":"ffffb3d3-b95a-4e56-8685-e32218a78bc4","Type":"ContainerDied","Data":"c7bf540db0462c915aa3c2f1831bc13f8d570a9fa02dde8c13a89a841d7df099"} Sep 29 10:36:41 crc kubenswrapper[4922]: I0929 10:36:41.361306 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gvc" event={"ID":"ffffb3d3-b95a-4e56-8685-e32218a78bc4","Type":"ContainerStarted","Data":"cb318942ced098d056f33c9a6b8180cac2056a8d9fcb89ffe9da063b52ca09fd"} Sep 29 10:36:41 crc kubenswrapper[4922]: I0929 
10:36:41.365148 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:36:42 crc kubenswrapper[4922]: I0929 10:36:42.374329 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gvc" event={"ID":"ffffb3d3-b95a-4e56-8685-e32218a78bc4","Type":"ContainerStarted","Data":"ae9a750c8e2017d68c05a33ee48051c9a9e886de0db8be68880c12b6daed4224"} Sep 29 10:36:43 crc kubenswrapper[4922]: I0929 10:36:43.384560 4922 generic.go:334] "Generic (PLEG): container finished" podID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerID="ae9a750c8e2017d68c05a33ee48051c9a9e886de0db8be68880c12b6daed4224" exitCode=0 Sep 29 10:36:43 crc kubenswrapper[4922]: I0929 10:36:43.384630 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gvc" event={"ID":"ffffb3d3-b95a-4e56-8685-e32218a78bc4","Type":"ContainerDied","Data":"ae9a750c8e2017d68c05a33ee48051c9a9e886de0db8be68880c12b6daed4224"} Sep 29 10:36:44 crc kubenswrapper[4922]: I0929 10:36:44.395884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gvc" event={"ID":"ffffb3d3-b95a-4e56-8685-e32218a78bc4","Type":"ContainerStarted","Data":"10eb741e68dd4bf4b6e29484f179088872232d9b28d2beedd2bb4d8ccb463dc4"} Sep 29 10:36:44 crc kubenswrapper[4922]: I0929 10:36:44.417955 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k8gvc" podStartSLOduration=2.981621901 podStartE2EDuration="5.417927782s" podCreationTimestamp="2025-09-29 10:36:39 +0000 UTC" firstStartedPulling="2025-09-29 10:36:41.364708371 +0000 UTC m=+3126.730938655" lastFinishedPulling="2025-09-29 10:36:43.801014272 +0000 UTC m=+3129.167244536" observedRunningTime="2025-09-29 10:36:44.415629472 +0000 UTC m=+3129.781859756" watchObservedRunningTime="2025-09-29 10:36:44.417927782 +0000 UTC m=+3129.784158046" Sep 29 10:36:49 crc 
kubenswrapper[4922]: I0929 10:36:49.902643 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:49 crc kubenswrapper[4922]: I0929 10:36:49.903627 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:49 crc kubenswrapper[4922]: I0929 10:36:49.949523 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:50 crc kubenswrapper[4922]: I0929 10:36:50.452273 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:36:50 crc kubenswrapper[4922]: E0929 10:36:50.452627 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:36:50 crc kubenswrapper[4922]: I0929 10:36:50.503760 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:50 crc kubenswrapper[4922]: I0929 10:36:50.561155 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8gvc"] Sep 29 10:36:52 crc kubenswrapper[4922]: I0929 10:36:52.467651 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k8gvc" podUID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerName="registry-server" containerID="cri-o://10eb741e68dd4bf4b6e29484f179088872232d9b28d2beedd2bb4d8ccb463dc4" gracePeriod=2 Sep 29 10:36:53 crc kubenswrapper[4922]: 
I0929 10:36:53.478823 4922 generic.go:334] "Generic (PLEG): container finished" podID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerID="10eb741e68dd4bf4b6e29484f179088872232d9b28d2beedd2bb4d8ccb463dc4" exitCode=0 Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.478948 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gvc" event={"ID":"ffffb3d3-b95a-4e56-8685-e32218a78bc4","Type":"ContainerDied","Data":"10eb741e68dd4bf4b6e29484f179088872232d9b28d2beedd2bb4d8ccb463dc4"} Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.479352 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gvc" event={"ID":"ffffb3d3-b95a-4e56-8685-e32218a78bc4","Type":"ContainerDied","Data":"cb318942ced098d056f33c9a6b8180cac2056a8d9fcb89ffe9da063b52ca09fd"} Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.479376 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb318942ced098d056f33c9a6b8180cac2056a8d9fcb89ffe9da063b52ca09fd" Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.534266 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.641204 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-utilities\") pod \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.641432 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf2h7\" (UniqueName: \"kubernetes.io/projected/ffffb3d3-b95a-4e56-8685-e32218a78bc4-kube-api-access-mf2h7\") pod \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.641489 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-catalog-content\") pod \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\" (UID: \"ffffb3d3-b95a-4e56-8685-e32218a78bc4\") " Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.642312 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-utilities" (OuterVolumeSpecName: "utilities") pod "ffffb3d3-b95a-4e56-8685-e32218a78bc4" (UID: "ffffb3d3-b95a-4e56-8685-e32218a78bc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.647848 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffffb3d3-b95a-4e56-8685-e32218a78bc4-kube-api-access-mf2h7" (OuterVolumeSpecName: "kube-api-access-mf2h7") pod "ffffb3d3-b95a-4e56-8685-e32218a78bc4" (UID: "ffffb3d3-b95a-4e56-8685-e32218a78bc4"). InnerVolumeSpecName "kube-api-access-mf2h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.690077 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffffb3d3-b95a-4e56-8685-e32218a78bc4" (UID: "ffffb3d3-b95a-4e56-8685-e32218a78bc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.743860 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf2h7\" (UniqueName: \"kubernetes.io/projected/ffffb3d3-b95a-4e56-8685-e32218a78bc4-kube-api-access-mf2h7\") on node \"crc\" DevicePath \"\"" Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.743900 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:36:53 crc kubenswrapper[4922]: I0929 10:36:53.743912 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffffb3d3-b95a-4e56-8685-e32218a78bc4-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:36:54 crc kubenswrapper[4922]: I0929 10:36:54.487454 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8gvc" Sep 29 10:36:54 crc kubenswrapper[4922]: I0929 10:36:54.527443 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8gvc"] Sep 29 10:36:54 crc kubenswrapper[4922]: I0929 10:36:54.536405 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k8gvc"] Sep 29 10:36:55 crc kubenswrapper[4922]: I0929 10:36:55.466791 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" path="/var/lib/kubelet/pods/ffffb3d3-b95a-4e56-8685-e32218a78bc4/volumes" Sep 29 10:37:04 crc kubenswrapper[4922]: I0929 10:37:04.451943 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:37:04 crc kubenswrapper[4922]: E0929 10:37:04.452723 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:37:19 crc kubenswrapper[4922]: I0929 10:37:19.452141 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:37:19 crc kubenswrapper[4922]: E0929 10:37:19.454252 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:37:33 crc kubenswrapper[4922]: I0929 10:37:33.454181 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:37:33 crc kubenswrapper[4922]: E0929 10:37:33.457272 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:37:45 crc kubenswrapper[4922]: I0929 10:37:45.458096 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:37:45 crc kubenswrapper[4922]: E0929 10:37:45.459216 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:37:56 crc kubenswrapper[4922]: I0929 10:37:56.452040 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:37:56 crc kubenswrapper[4922]: E0929 10:37:56.453872 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:38:07 crc kubenswrapper[4922]: I0929 10:38:07.453497 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:38:08 crc kubenswrapper[4922]: I0929 10:38:08.191252 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"8a9e40e65a014756644053b12ffe5e16feda853e5105ed46b8827c7da0ecf1b6"} Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.625670 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cs6sn"] Sep 29 10:39:50 crc kubenswrapper[4922]: E0929 10:39:50.626890 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerName="registry-server" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.626914 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerName="registry-server" Sep 29 10:39:50 crc kubenswrapper[4922]: E0929 10:39:50.626936 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerName="extract-content" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.626947 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerName="extract-content" Sep 29 10:39:50 crc kubenswrapper[4922]: E0929 10:39:50.627000 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerName="extract-utilities" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.627012 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerName="extract-utilities" Sep 
29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.627356 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffffb3d3-b95a-4e56-8685-e32218a78bc4" containerName="registry-server" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.629747 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.636241 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cs6sn"] Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.693021 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9rw\" (UniqueName: \"kubernetes.io/projected/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-kube-api-access-8v9rw\") pod \"redhat-marketplace-cs6sn\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.693133 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-utilities\") pod \"redhat-marketplace-cs6sn\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.693355 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-catalog-content\") pod \"redhat-marketplace-cs6sn\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.795526 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-catalog-content\") pod \"redhat-marketplace-cs6sn\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.795676 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9rw\" (UniqueName: \"kubernetes.io/projected/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-kube-api-access-8v9rw\") pod \"redhat-marketplace-cs6sn\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.795755 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-utilities\") pod \"redhat-marketplace-cs6sn\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.796440 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-utilities\") pod \"redhat-marketplace-cs6sn\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.796717 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-catalog-content\") pod \"redhat-marketplace-cs6sn\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.824536 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9rw\" (UniqueName: 
\"kubernetes.io/projected/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-kube-api-access-8v9rw\") pod \"redhat-marketplace-cs6sn\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:50 crc kubenswrapper[4922]: I0929 10:39:50.971267 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:39:51 crc kubenswrapper[4922]: I0929 10:39:51.471054 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cs6sn"] Sep 29 10:39:52 crc kubenswrapper[4922]: I0929 10:39:52.179075 4922 generic.go:334] "Generic (PLEG): container finished" podID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerID="5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824" exitCode=0 Sep 29 10:39:52 crc kubenswrapper[4922]: I0929 10:39:52.179207 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cs6sn" event={"ID":"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed","Type":"ContainerDied","Data":"5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824"} Sep 29 10:39:52 crc kubenswrapper[4922]: I0929 10:39:52.179528 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cs6sn" event={"ID":"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed","Type":"ContainerStarted","Data":"fb1437d5b101ac32b861fed431f36878db4cd278ec380d4b4424dd447b1fa4fb"} Sep 29 10:39:53 crc kubenswrapper[4922]: I0929 10:39:53.198675 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cs6sn" event={"ID":"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed","Type":"ContainerStarted","Data":"0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809"} Sep 29 10:39:54 crc kubenswrapper[4922]: I0929 10:39:54.209748 4922 generic.go:334] "Generic (PLEG): container finished" podID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" 
containerID="0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809" exitCode=0 Sep 29 10:39:54 crc kubenswrapper[4922]: I0929 10:39:54.210630 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cs6sn" event={"ID":"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed","Type":"ContainerDied","Data":"0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809"} Sep 29 10:39:55 crc kubenswrapper[4922]: I0929 10:39:55.226423 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cs6sn" event={"ID":"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed","Type":"ContainerStarted","Data":"69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706"} Sep 29 10:39:55 crc kubenswrapper[4922]: I0929 10:39:55.254771 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cs6sn" podStartSLOduration=2.811858743 podStartE2EDuration="5.254748454s" podCreationTimestamp="2025-09-29 10:39:50 +0000 UTC" firstStartedPulling="2025-09-29 10:39:52.180567499 +0000 UTC m=+3317.546797763" lastFinishedPulling="2025-09-29 10:39:54.62345721 +0000 UTC m=+3319.989687474" observedRunningTime="2025-09-29 10:39:55.244918403 +0000 UTC m=+3320.611148677" watchObservedRunningTime="2025-09-29 10:39:55.254748454 +0000 UTC m=+3320.620978718" Sep 29 10:40:00 crc kubenswrapper[4922]: I0929 10:40:00.971597 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:40:00 crc kubenswrapper[4922]: I0929 10:40:00.971999 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:40:01 crc kubenswrapper[4922]: I0929 10:40:01.035725 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:40:01 crc kubenswrapper[4922]: I0929 10:40:01.321571 
4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:40:01 crc kubenswrapper[4922]: I0929 10:40:01.375151 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cs6sn"] Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.292086 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cs6sn" podUID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerName="registry-server" containerID="cri-o://69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706" gracePeriod=2 Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.759029 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cs6sn" Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.869622 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-utilities\") pod \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.870022 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v9rw\" (UniqueName: \"kubernetes.io/projected/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-kube-api-access-8v9rw\") pod \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.870073 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-catalog-content\") pod \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\" (UID: \"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed\") " Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.870591 
4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-utilities" (OuterVolumeSpecName: "utilities") pod "9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" (UID: "9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.883518 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" (UID: "9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.883657 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-kube-api-access-8v9rw" (OuterVolumeSpecName: "kube-api-access-8v9rw") pod "9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" (UID: "9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed"). InnerVolumeSpecName "kube-api-access-8v9rw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.971529 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v9rw\" (UniqueName: \"kubernetes.io/projected/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-kube-api-access-8v9rw\") on node \"crc\" DevicePath \"\""
Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.971790 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 10:40:03 crc kubenswrapper[4922]: I0929 10:40:03.971901 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.303106 4922 generic.go:334] "Generic (PLEG): container finished" podID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerID="69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706" exitCode=0
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.303188 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cs6sn" event={"ID":"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed","Type":"ContainerDied","Data":"69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706"}
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.303176 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cs6sn"
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.303236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cs6sn" event={"ID":"9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed","Type":"ContainerDied","Data":"fb1437d5b101ac32b861fed431f36878db4cd278ec380d4b4424dd447b1fa4fb"}
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.303520 4922 scope.go:117] "RemoveContainer" containerID="69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706"
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.325963 4922 scope.go:117] "RemoveContainer" containerID="0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809"
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.347547 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cs6sn"]
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.350598 4922 scope.go:117] "RemoveContainer" containerID="5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824"
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.365306 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cs6sn"]
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.399989 4922 scope.go:117] "RemoveContainer" containerID="69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706"
Sep 29 10:40:04 crc kubenswrapper[4922]: E0929 10:40:04.400550 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706\": container with ID starting with 69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706 not found: ID does not exist" containerID="69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706"
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.400670 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706"} err="failed to get container status \"69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706\": rpc error: code = NotFound desc = could not find container \"69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706\": container with ID starting with 69d4b133c2abd56bd8fab14fa0936acc886d7ea2cdc48d6663a9804a8b279706 not found: ID does not exist"
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.400775 4922 scope.go:117] "RemoveContainer" containerID="0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809"
Sep 29 10:40:04 crc kubenswrapper[4922]: E0929 10:40:04.401315 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809\": container with ID starting with 0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809 not found: ID does not exist" containerID="0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809"
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.401363 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809"} err="failed to get container status \"0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809\": rpc error: code = NotFound desc = could not find container \"0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809\": container with ID starting with 0e79eda6e5c8fa0f91b7deacb495f7133e2ce7848e8dc4d0d29349fa4af08809 not found: ID does not exist"
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.401393 4922 scope.go:117] "RemoveContainer" containerID="5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824"
Sep 29 10:40:04 crc kubenswrapper[4922]: E0929 10:40:04.401789 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824\": container with ID starting with 5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824 not found: ID does not exist" containerID="5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824"
Sep 29 10:40:04 crc kubenswrapper[4922]: I0929 10:40:04.401826 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824"} err="failed to get container status \"5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824\": rpc error: code = NotFound desc = could not find container \"5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824\": container with ID starting with 5b9543338efd612c4eb6dd7604ae2f9ca9b26c5e69f8462ebf82fc1cd2ae7824 not found: ID does not exist"
Sep 29 10:40:05 crc kubenswrapper[4922]: I0929 10:40:05.477345 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" path="/var/lib/kubelet/pods/9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed/volumes"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.589038 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qmqvl"]
Sep 29 10:40:10 crc kubenswrapper[4922]: E0929 10:40:10.589962 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerName="extract-utilities"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.589974 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerName="extract-utilities"
Sep 29 10:40:10 crc kubenswrapper[4922]: E0929 10:40:10.589993 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerName="extract-content"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.590002 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerName="extract-content"
Sep 29 10:40:10 crc kubenswrapper[4922]: E0929 10:40:10.590022 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerName="registry-server"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.590028 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerName="registry-server"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.590239 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de4e82a-4d57-40dc-9ac5-fbbdbcfce3ed" containerName="registry-server"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.591846 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.610805 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmqvl"]
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.708094 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d7zh\" (UniqueName: \"kubernetes.io/projected/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-kube-api-access-4d7zh\") pod \"redhat-operators-qmqvl\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") " pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.708257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-utilities\") pod \"redhat-operators-qmqvl\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") " pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.708570 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-catalog-content\") pod \"redhat-operators-qmqvl\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") " pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.810726 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-catalog-content\") pod \"redhat-operators-qmqvl\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") " pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.810901 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d7zh\" (UniqueName: \"kubernetes.io/projected/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-kube-api-access-4d7zh\") pod \"redhat-operators-qmqvl\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") " pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.810948 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-utilities\") pod \"redhat-operators-qmqvl\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") " pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.811447 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-catalog-content\") pod \"redhat-operators-qmqvl\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") " pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.811632 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-utilities\") pod \"redhat-operators-qmqvl\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") " pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.833440 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d7zh\" (UniqueName: \"kubernetes.io/projected/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-kube-api-access-4d7zh\") pod \"redhat-operators-qmqvl\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") " pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:10 crc kubenswrapper[4922]: I0929 10:40:10.930124 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:11 crc kubenswrapper[4922]: I0929 10:40:11.468441 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmqvl"]
Sep 29 10:40:12 crc kubenswrapper[4922]: I0929 10:40:12.397400 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerID="11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e" exitCode=0
Sep 29 10:40:12 crc kubenswrapper[4922]: I0929 10:40:12.397762 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmqvl" event={"ID":"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41","Type":"ContainerDied","Data":"11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e"}
Sep 29 10:40:12 crc kubenswrapper[4922]: I0929 10:40:12.397789 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmqvl" event={"ID":"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41","Type":"ContainerStarted","Data":"98439db61d53aa8e4f68dfdd1b8652f1852040a8b11fa78f525631d3124fe5fd"}
Sep 29 10:40:14 crc kubenswrapper[4922]: I0929 10:40:14.417504 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmqvl" event={"ID":"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41","Type":"ContainerStarted","Data":"7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59"}
Sep 29 10:40:15 crc kubenswrapper[4922]: I0929 10:40:15.430768 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerID="7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59" exitCode=0
Sep 29 10:40:15 crc kubenswrapper[4922]: I0929 10:40:15.430812 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmqvl" event={"ID":"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41","Type":"ContainerDied","Data":"7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59"}
Sep 29 10:40:16 crc kubenswrapper[4922]: I0929 10:40:16.444103 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmqvl" event={"ID":"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41","Type":"ContainerStarted","Data":"c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5"}
Sep 29 10:40:16 crc kubenswrapper[4922]: I0929 10:40:16.476273 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qmqvl" podStartSLOduration=2.749458786 podStartE2EDuration="6.476246269s" podCreationTimestamp="2025-09-29 10:40:10 +0000 UTC" firstStartedPulling="2025-09-29 10:40:12.400265117 +0000 UTC m=+3337.766495381" lastFinishedPulling="2025-09-29 10:40:16.1270526 +0000 UTC m=+3341.493282864" observedRunningTime="2025-09-29 10:40:16.466471899 +0000 UTC m=+3341.832702183" watchObservedRunningTime="2025-09-29 10:40:16.476246269 +0000 UTC m=+3341.842476543"
Sep 29 10:40:20 crc kubenswrapper[4922]: I0929 10:40:20.930296 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:20 crc kubenswrapper[4922]: I0929 10:40:20.930922 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:20 crc kubenswrapper[4922]: I0929 10:40:20.981322 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:21 crc kubenswrapper[4922]: I0929 10:40:21.550372 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:21 crc kubenswrapper[4922]: I0929 10:40:21.841221 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qmqvl"]
Sep 29 10:40:23 crc kubenswrapper[4922]: I0929 10:40:23.504253 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qmqvl" podUID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerName="registry-server" containerID="cri-o://c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5" gracePeriod=2
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.012106 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.093064 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d7zh\" (UniqueName: \"kubernetes.io/projected/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-kube-api-access-4d7zh\") pod \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") "
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.093213 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-utilities\") pod \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") "
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.093310 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-catalog-content\") pod \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\" (UID: \"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41\") "
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.094116 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-utilities" (OuterVolumeSpecName: "utilities") pod "fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" (UID: "fdaaac6d-a4f3-47aa-b0f6-26f690f18c41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.100061 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-kube-api-access-4d7zh" (OuterVolumeSpecName: "kube-api-access-4d7zh") pod "fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" (UID: "fdaaac6d-a4f3-47aa-b0f6-26f690f18c41"). InnerVolumeSpecName "kube-api-access-4d7zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.179641 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" (UID: "fdaaac6d-a4f3-47aa-b0f6-26f690f18c41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.197695 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.197984 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.198057 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d7zh\" (UniqueName: \"kubernetes.io/projected/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41-kube-api-access-4d7zh\") on node \"crc\" DevicePath \"\""
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.517797 4922 generic.go:334] "Generic (PLEG): container finished" podID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerID="c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5" exitCode=0
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.517905 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmqvl" event={"ID":"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41","Type":"ContainerDied","Data":"c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5"}
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.517982 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmqvl" event={"ID":"fdaaac6d-a4f3-47aa-b0f6-26f690f18c41","Type":"ContainerDied","Data":"98439db61d53aa8e4f68dfdd1b8652f1852040a8b11fa78f525631d3124fe5fd"}
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.517981 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmqvl"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.518009 4922 scope.go:117] "RemoveContainer" containerID="c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.546223 4922 scope.go:117] "RemoveContainer" containerID="7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.565164 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qmqvl"]
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.575654 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qmqvl"]
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.583919 4922 scope.go:117] "RemoveContainer" containerID="11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.625314 4922 scope.go:117] "RemoveContainer" containerID="c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5"
Sep 29 10:40:24 crc kubenswrapper[4922]: E0929 10:40:24.626335 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5\": container with ID starting with c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5 not found: ID does not exist" containerID="c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.626423 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5"} err="failed to get container status \"c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5\": rpc error: code = NotFound desc = could not find container \"c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5\": container with ID starting with c676570846d041ea178d844d4ecc204aa010d84ba82e5d9f0e6e5c385fa625d5 not found: ID does not exist"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.626474 4922 scope.go:117] "RemoveContainer" containerID="7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59"
Sep 29 10:40:24 crc kubenswrapper[4922]: E0929 10:40:24.627143 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59\": container with ID starting with 7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59 not found: ID does not exist" containerID="7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.627194 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59"} err="failed to get container status \"7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59\": rpc error: code = NotFound desc = could not find container \"7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59\": container with ID starting with 7c00ff2b15da394a66b72e785e1d187918bb78fc2faa0878fdd6c35aa08b0a59 not found: ID does not exist"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.627227 4922 scope.go:117] "RemoveContainer" containerID="11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e"
Sep 29 10:40:24 crc kubenswrapper[4922]: E0929 10:40:24.627634 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e\": container with ID starting with 11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e not found: ID does not exist" containerID="11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e"
Sep 29 10:40:24 crc kubenswrapper[4922]: I0929 10:40:24.627678 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e"} err="failed to get container status \"11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e\": rpc error: code = NotFound desc = could not find container \"11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e\": container with ID starting with 11c8eddc1be4bb8ce99728ae2427a4a140544f072eff71c550c5eb4ce1a0de8e not found: ID does not exist"
Sep 29 10:40:25 crc kubenswrapper[4922]: I0929 10:40:25.465386 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" path="/var/lib/kubelet/pods/fdaaac6d-a4f3-47aa-b0f6-26f690f18c41/volumes"
Sep 29 10:40:29 crc kubenswrapper[4922]: I0929 10:40:29.070781 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:40:29 crc kubenswrapper[4922]: I0929 10:40:29.071442 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.750243 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j64nx"]
Sep 29 10:40:37 crc kubenswrapper[4922]: E0929 10:40:37.751297 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerName="extract-content"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.751315 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerName="extract-content"
Sep 29 10:40:37 crc kubenswrapper[4922]: E0929 10:40:37.751336 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerName="registry-server"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.751344 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerName="registry-server"
Sep 29 10:40:37 crc kubenswrapper[4922]: E0929 10:40:37.751363 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerName="extract-utilities"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.751371 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerName="extract-utilities"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.751615 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdaaac6d-a4f3-47aa-b0f6-26f690f18c41" containerName="registry-server"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.754319 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.771973 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j64nx"]
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.861795 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79qw\" (UniqueName: \"kubernetes.io/projected/5da79e97-2cbf-404f-99ac-e285c7767162-kube-api-access-q79qw\") pod \"certified-operators-j64nx\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") " pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.861940 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-utilities\") pod \"certified-operators-j64nx\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") " pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.862019 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-catalog-content\") pod \"certified-operators-j64nx\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") " pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.963999 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-catalog-content\") pod \"certified-operators-j64nx\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") " pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.964140 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q79qw\" (UniqueName: \"kubernetes.io/projected/5da79e97-2cbf-404f-99ac-e285c7767162-kube-api-access-q79qw\") pod \"certified-operators-j64nx\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") " pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.964584 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-utilities\") pod \"certified-operators-j64nx\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") " pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.964622 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-catalog-content\") pod \"certified-operators-j64nx\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") " pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.965079 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-utilities\") pod \"certified-operators-j64nx\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") " pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:37 crc kubenswrapper[4922]: I0929 10:40:37.993765 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79qw\" (UniqueName: \"kubernetes.io/projected/5da79e97-2cbf-404f-99ac-e285c7767162-kube-api-access-q79qw\") pod \"certified-operators-j64nx\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") " pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:38 crc kubenswrapper[4922]: I0929 10:40:38.074285 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:38 crc kubenswrapper[4922]: I0929 10:40:38.574818 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j64nx"]
Sep 29 10:40:38 crc kubenswrapper[4922]: I0929 10:40:38.663951 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j64nx" event={"ID":"5da79e97-2cbf-404f-99ac-e285c7767162","Type":"ContainerStarted","Data":"554ccde4d2ed307fb169cf2caddd0abf91c6db052e0e9ab258d414a5f7272b12"}
Sep 29 10:40:39 crc kubenswrapper[4922]: I0929 10:40:39.677393 4922 generic.go:334] "Generic (PLEG): container finished" podID="5da79e97-2cbf-404f-99ac-e285c7767162" containerID="83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8" exitCode=0
Sep 29 10:40:39 crc kubenswrapper[4922]: I0929 10:40:39.677459 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j64nx" event={"ID":"5da79e97-2cbf-404f-99ac-e285c7767162","Type":"ContainerDied","Data":"83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8"}
Sep 29 10:40:41 crc kubenswrapper[4922]: I0929 10:40:41.701499 4922 generic.go:334] "Generic (PLEG): container finished" podID="5da79e97-2cbf-404f-99ac-e285c7767162" containerID="8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460" exitCode=0
Sep 29 10:40:41 crc kubenswrapper[4922]: I0929 10:40:41.702178 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j64nx" event={"ID":"5da79e97-2cbf-404f-99ac-e285c7767162","Type":"ContainerDied","Data":"8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460"}
Sep 29 10:40:42 crc kubenswrapper[4922]: I0929 10:40:42.715992 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j64nx" event={"ID":"5da79e97-2cbf-404f-99ac-e285c7767162","Type":"ContainerStarted","Data":"474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08"}
Sep 29 10:40:42 crc kubenswrapper[4922]: I0929 10:40:42.742393 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j64nx" podStartSLOduration=2.966399858 podStartE2EDuration="5.742361698s" podCreationTimestamp="2025-09-29 10:40:37 +0000 UTC" firstStartedPulling="2025-09-29 10:40:39.680086876 +0000 UTC m=+3365.046317160" lastFinishedPulling="2025-09-29 10:40:42.456048736 +0000 UTC m=+3367.822279000" observedRunningTime="2025-09-29 10:40:42.731519792 +0000 UTC m=+3368.097750076" watchObservedRunningTime="2025-09-29 10:40:42.742361698 +0000 UTC m=+3368.108591972"
Sep 29 10:40:48 crc kubenswrapper[4922]: I0929 10:40:48.074910 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:48 crc kubenswrapper[4922]: I0929 10:40:48.075495 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:48 crc kubenswrapper[4922]: I0929 10:40:48.128196 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:48 crc kubenswrapper[4922]: I0929 10:40:48.811408 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:48 crc kubenswrapper[4922]: I0929 10:40:48.849665 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j64nx"]
Sep 29 10:40:50 crc kubenswrapper[4922]: I0929 10:40:50.786036 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j64nx" podUID="5da79e97-2cbf-404f-99ac-e285c7767162" containerName="registry-server" containerID="cri-o://474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08" gracePeriod=2
Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.291328 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j64nx"
Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.418188 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q79qw\" (UniqueName: \"kubernetes.io/projected/5da79e97-2cbf-404f-99ac-e285c7767162-kube-api-access-q79qw\") pod \"5da79e97-2cbf-404f-99ac-e285c7767162\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") "
Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.418243 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-utilities\") pod \"5da79e97-2cbf-404f-99ac-e285c7767162\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") "
Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.418264 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-catalog-content\") pod \"5da79e97-2cbf-404f-99ac-e285c7767162\" (UID: \"5da79e97-2cbf-404f-99ac-e285c7767162\") "
Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.419174 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-utilities" (OuterVolumeSpecName: "utilities") pod "5da79e97-2cbf-404f-99ac-e285c7767162" (UID: "5da79e97-2cbf-404f-99ac-e285c7767162"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.427964 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da79e97-2cbf-404f-99ac-e285c7767162-kube-api-access-q79qw" (OuterVolumeSpecName: "kube-api-access-q79qw") pod "5da79e97-2cbf-404f-99ac-e285c7767162" (UID: "5da79e97-2cbf-404f-99ac-e285c7767162"). InnerVolumeSpecName "kube-api-access-q79qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.480196 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5da79e97-2cbf-404f-99ac-e285c7767162" (UID: "5da79e97-2cbf-404f-99ac-e285c7767162"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.521521 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.521598 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da79e97-2cbf-404f-99ac-e285c7767162-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.521617 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q79qw\" (UniqueName: \"kubernetes.io/projected/5da79e97-2cbf-404f-99ac-e285c7767162-kube-api-access-q79qw\") on node \"crc\" DevicePath \"\"" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.797192 4922 generic.go:334] "Generic (PLEG): container finished" podID="5da79e97-2cbf-404f-99ac-e285c7767162" 
containerID="474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08" exitCode=0 Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.797277 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j64nx" event={"ID":"5da79e97-2cbf-404f-99ac-e285c7767162","Type":"ContainerDied","Data":"474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08"} Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.797289 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j64nx" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.797992 4922 scope.go:117] "RemoveContainer" containerID="474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.797952 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j64nx" event={"ID":"5da79e97-2cbf-404f-99ac-e285c7767162","Type":"ContainerDied","Data":"554ccde4d2ed307fb169cf2caddd0abf91c6db052e0e9ab258d414a5f7272b12"} Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.820382 4922 scope.go:117] "RemoveContainer" containerID="8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.837180 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j64nx"] Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.846921 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j64nx"] Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.862469 4922 scope.go:117] "RemoveContainer" containerID="83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.904860 4922 scope.go:117] "RemoveContainer" containerID="474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08" Sep 29 
10:40:51 crc kubenswrapper[4922]: E0929 10:40:51.905710 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08\": container with ID starting with 474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08 not found: ID does not exist" containerID="474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.905767 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08"} err="failed to get container status \"474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08\": rpc error: code = NotFound desc = could not find container \"474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08\": container with ID starting with 474971e800ff26f06c3fe2b912169a3b102ff8a217ffb56489b12215d9913b08 not found: ID does not exist" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.905811 4922 scope.go:117] "RemoveContainer" containerID="8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460" Sep 29 10:40:51 crc kubenswrapper[4922]: E0929 10:40:51.906312 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460\": container with ID starting with 8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460 not found: ID does not exist" containerID="8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.906345 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460"} err="failed to get container status 
\"8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460\": rpc error: code = NotFound desc = could not find container \"8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460\": container with ID starting with 8e913137dce7241173970f6a8cdc97bc4d4dbd4e016831b5bfd145b5e5d8c460 not found: ID does not exist" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.906361 4922 scope.go:117] "RemoveContainer" containerID="83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8" Sep 29 10:40:51 crc kubenswrapper[4922]: E0929 10:40:51.906806 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8\": container with ID starting with 83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8 not found: ID does not exist" containerID="83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8" Sep 29 10:40:51 crc kubenswrapper[4922]: I0929 10:40:51.906859 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8"} err="failed to get container status \"83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8\": rpc error: code = NotFound desc = could not find container \"83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8\": container with ID starting with 83f8f56258ad0c0cae4031b543985f8dc9e473a13e000f6251e90e771585a3e8 not found: ID does not exist" Sep 29 10:40:53 crc kubenswrapper[4922]: I0929 10:40:53.474720 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da79e97-2cbf-404f-99ac-e285c7767162" path="/var/lib/kubelet/pods/5da79e97-2cbf-404f-99ac-e285c7767162/volumes" Sep 29 10:40:59 crc kubenswrapper[4922]: I0929 10:40:59.070858 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:40:59 crc kubenswrapper[4922]: I0929 10:40:59.071455 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:41:29 crc kubenswrapper[4922]: I0929 10:41:29.070392 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:41:29 crc kubenswrapper[4922]: I0929 10:41:29.071179 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:41:29 crc kubenswrapper[4922]: I0929 10:41:29.071251 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 10:41:29 crc kubenswrapper[4922]: I0929 10:41:29.072589 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a9e40e65a014756644053b12ffe5e16feda853e5105ed46b8827c7da0ecf1b6"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:41:29 crc 
kubenswrapper[4922]: I0929 10:41:29.072714 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://8a9e40e65a014756644053b12ffe5e16feda853e5105ed46b8827c7da0ecf1b6" gracePeriod=600 Sep 29 10:41:30 crc kubenswrapper[4922]: I0929 10:41:30.175698 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="8a9e40e65a014756644053b12ffe5e16feda853e5105ed46b8827c7da0ecf1b6" exitCode=0 Sep 29 10:41:30 crc kubenswrapper[4922]: I0929 10:41:30.175778 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"8a9e40e65a014756644053b12ffe5e16feda853e5105ed46b8827c7da0ecf1b6"} Sep 29 10:41:30 crc kubenswrapper[4922]: I0929 10:41:30.176684 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401"} Sep 29 10:41:30 crc kubenswrapper[4922]: I0929 10:41:30.176709 4922 scope.go:117] "RemoveContainer" containerID="3dc8f959d88c4350e375ccd53e0e684d86ea46542256a5452d19833a4658b530" Sep 29 10:42:22 crc kubenswrapper[4922]: I0929 10:42:22.667628 4922 generic.go:334] "Generic (PLEG): container finished" podID="b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" containerID="e3e40029d4e3ab2cf562fc4432270a3d3a7655bfc09b70c5f3a8235848974c56" exitCode=0 Sep 29 10:42:22 crc kubenswrapper[4922]: I0929 10:42:22.667731 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab","Type":"ContainerDied","Data":"e3e40029d4e3ab2cf562fc4432270a3d3a7655bfc09b70c5f3a8235848974c56"} Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.037851 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.154020 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-workdir\") pod \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.154181 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-config-data\") pod \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.154390 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config\") pod \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.154462 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config-secret\") pod \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.154502 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ca-certs\") pod \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.154525 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-temporary\") pod \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.154568 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.154589 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ssh-key\") pod \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.154625 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vfmb\" (UniqueName: \"kubernetes.io/projected/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-kube-api-access-8vfmb\") pod \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\" (UID: \"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab\") " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.155242 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" (UID: "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab"). 
InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.155458 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-config-data" (OuterVolumeSpecName: "config-data") pod "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" (UID: "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.156438 4922 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.156474 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.158257 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" (UID: "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.167999 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" (UID: "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.168442 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-kube-api-access-8vfmb" (OuterVolumeSpecName: "kube-api-access-8vfmb") pod "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" (UID: "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab"). InnerVolumeSpecName "kube-api-access-8vfmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.187098 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" (UID: "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.191540 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" (UID: "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.199658 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" (UID: "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.210051 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" (UID: "b34ceaf2-30f5-4be7-8806-fad8a2bd21ab"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.258652 4922 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.258691 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.258706 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.258719 4922 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ca-certs\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.258761 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.258773 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.258784 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vfmb\" (UniqueName: \"kubernetes.io/projected/b34ceaf2-30f5-4be7-8806-fad8a2bd21ab-kube-api-access-8vfmb\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.279243 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.361595 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.690706 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b34ceaf2-30f5-4be7-8806-fad8a2bd21ab","Type":"ContainerDied","Data":"d36e5fa74e7ff5b1cf4c194dceff70e5671af408407c46679581ab7105d5f1f3"} Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.691157 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d36e5fa74e7ff5b1cf4c194dceff70e5671af408407c46679581ab7105d5f1f3" Sep 29 10:42:24 crc kubenswrapper[4922]: I0929 10:42:24.690812 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.039211 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 29 10:42:32 crc kubenswrapper[4922]: E0929 10:42:32.039954 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da79e97-2cbf-404f-99ac-e285c7767162" containerName="registry-server" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.039969 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da79e97-2cbf-404f-99ac-e285c7767162" containerName="registry-server" Sep 29 10:42:32 crc kubenswrapper[4922]: E0929 10:42:32.039992 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da79e97-2cbf-404f-99ac-e285c7767162" containerName="extract-utilities" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.040003 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da79e97-2cbf-404f-99ac-e285c7767162" containerName="extract-utilities" Sep 29 10:42:32 crc kubenswrapper[4922]: E0929 10:42:32.040036 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" containerName="tempest-tests-tempest-tests-runner" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.040045 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" containerName="tempest-tests-tempest-tests-runner" Sep 29 10:42:32 crc kubenswrapper[4922]: E0929 10:42:32.040061 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da79e97-2cbf-404f-99ac-e285c7767162" containerName="extract-content" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.040070 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da79e97-2cbf-404f-99ac-e285c7767162" containerName="extract-content" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.040271 4922 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="5da79e97-2cbf-404f-99ac-e285c7767162" containerName="registry-server" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.040286 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34ceaf2-30f5-4be7-8806-fad8a2bd21ab" containerName="tempest-tests-tempest-tests-runner" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.040945 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.043191 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7l6b4" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.047455 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.124097 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.124158 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzjfw\" (UniqueName: \"kubernetes.io/projected/03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df-kube-api-access-qzjfw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.225450 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.225533 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzjfw\" (UniqueName: \"kubernetes.io/projected/03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df-kube-api-access-qzjfw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.226029 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.254908 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzjfw\" (UniqueName: \"kubernetes.io/projected/03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df-kube-api-access-qzjfw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.258938 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:42:32 
crc kubenswrapper[4922]: I0929 10:42:32.368401 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.827104 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 29 10:42:32 crc kubenswrapper[4922]: I0929 10:42:32.833606 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:42:33 crc kubenswrapper[4922]: I0929 10:42:33.773490 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df","Type":"ContainerStarted","Data":"a33da98cba78835b66f5d7b4b1012d1d756f4aa9225da3a8d0fe56df304786a6"} Sep 29 10:42:34 crc kubenswrapper[4922]: I0929 10:42:34.789977 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df","Type":"ContainerStarted","Data":"06f78e75f1742e175923143f36f0c32e21343d6d268594deb97bf2b08636dcbb"} Sep 29 10:42:34 crc kubenswrapper[4922]: I0929 10:42:34.809650 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.942355267 podStartE2EDuration="2.809631807s" podCreationTimestamp="2025-09-29 10:42:32 +0000 UTC" firstStartedPulling="2025-09-29 10:42:32.833382173 +0000 UTC m=+3478.199612437" lastFinishedPulling="2025-09-29 10:42:33.700658713 +0000 UTC m=+3479.066888977" observedRunningTime="2025-09-29 10:42:34.803313316 +0000 UTC m=+3480.169543600" watchObservedRunningTime="2025-09-29 10:42:34.809631807 +0000 UTC m=+3480.175862071" Sep 29 10:42:52 crc kubenswrapper[4922]: I0929 10:42:52.115370 4922 scope.go:117] "RemoveContainer" 
containerID="c7bf540db0462c915aa3c2f1831bc13f8d570a9fa02dde8c13a89a841d7df099" Sep 29 10:42:52 crc kubenswrapper[4922]: I0929 10:42:52.151217 4922 scope.go:117] "RemoveContainer" containerID="10eb741e68dd4bf4b6e29484f179088872232d9b28d2beedd2bb4d8ccb463dc4" Sep 29 10:42:52 crc kubenswrapper[4922]: I0929 10:42:52.189690 4922 scope.go:117] "RemoveContainer" containerID="ae9a750c8e2017d68c05a33ee48051c9a9e886de0db8be68880c12b6daed4224" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.002748 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l9464/must-gather-rprqs"] Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.004965 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.007120 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-l9464"/"default-dockercfg-2mmzr" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.007338 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l9464"/"kube-root-ca.crt" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.012047 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l9464/must-gather-rprqs"] Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.018650 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l9464"/"openshift-service-ca.crt" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.123194 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdsrn\" (UniqueName: \"kubernetes.io/projected/2a4eb407-0186-4d8e-8df5-e301333ff978-kube-api-access-kdsrn\") pod \"must-gather-rprqs\" (UID: \"2a4eb407-0186-4d8e-8df5-e301333ff978\") " pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 
10:42:53.123356 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a4eb407-0186-4d8e-8df5-e301333ff978-must-gather-output\") pod \"must-gather-rprqs\" (UID: \"2a4eb407-0186-4d8e-8df5-e301333ff978\") " pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.225299 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdsrn\" (UniqueName: \"kubernetes.io/projected/2a4eb407-0186-4d8e-8df5-e301333ff978-kube-api-access-kdsrn\") pod \"must-gather-rprqs\" (UID: \"2a4eb407-0186-4d8e-8df5-e301333ff978\") " pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.225403 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a4eb407-0186-4d8e-8df5-e301333ff978-must-gather-output\") pod \"must-gather-rprqs\" (UID: \"2a4eb407-0186-4d8e-8df5-e301333ff978\") " pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.225791 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a4eb407-0186-4d8e-8df5-e301333ff978-must-gather-output\") pod \"must-gather-rprqs\" (UID: \"2a4eb407-0186-4d8e-8df5-e301333ff978\") " pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.247892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdsrn\" (UniqueName: \"kubernetes.io/projected/2a4eb407-0186-4d8e-8df5-e301333ff978-kube-api-access-kdsrn\") pod \"must-gather-rprqs\" (UID: \"2a4eb407-0186-4d8e-8df5-e301333ff978\") " pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.324350 
4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.793172 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l9464/must-gather-rprqs"] Sep 29 10:42:53 crc kubenswrapper[4922]: I0929 10:42:53.959020 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/must-gather-rprqs" event={"ID":"2a4eb407-0186-4d8e-8df5-e301333ff978","Type":"ContainerStarted","Data":"227db4f83519a99ab32af0f078c1f499ef3eb25e75062f69309a6c092a2d0f17"} Sep 29 10:42:59 crc kubenswrapper[4922]: I0929 10:42:59.001986 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/must-gather-rprqs" event={"ID":"2a4eb407-0186-4d8e-8df5-e301333ff978","Type":"ContainerStarted","Data":"73487934a7c4e7c71344da42daed43b1ef4b431438c9bab85dac3a30c4e858a3"} Sep 29 10:42:59 crc kubenswrapper[4922]: I0929 10:42:59.002572 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/must-gather-rprqs" event={"ID":"2a4eb407-0186-4d8e-8df5-e301333ff978","Type":"ContainerStarted","Data":"651b84d4d0846c5ca6f5535eada2c3a66f380d007edbb0f9fb82fdcfb762da52"} Sep 29 10:42:59 crc kubenswrapper[4922]: I0929 10:42:59.024072 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l9464/must-gather-rprqs" podStartSLOduration=3.11444929 podStartE2EDuration="7.024053503s" podCreationTimestamp="2025-09-29 10:42:52 +0000 UTC" firstStartedPulling="2025-09-29 10:42:53.817991288 +0000 UTC m=+3499.184221552" lastFinishedPulling="2025-09-29 10:42:57.727595501 +0000 UTC m=+3503.093825765" observedRunningTime="2025-09-29 10:42:59.013714548 +0000 UTC m=+3504.379944812" watchObservedRunningTime="2025-09-29 10:42:59.024053503 +0000 UTC m=+3504.390283767" Sep 29 10:43:01 crc kubenswrapper[4922]: I0929 10:43:01.522272 4922 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-must-gather-l9464/crc-debug-dblxr"] Sep 29 10:43:01 crc kubenswrapper[4922]: I0929 10:43:01.524422 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:43:01 crc kubenswrapper[4922]: I0929 10:43:01.685880 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4bd67cb2-d0ba-4420-be0b-a49ea8817669-host\") pod \"crc-debug-dblxr\" (UID: \"4bd67cb2-d0ba-4420-be0b-a49ea8817669\") " pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:43:01 crc kubenswrapper[4922]: I0929 10:43:01.685937 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkrp\" (UniqueName: \"kubernetes.io/projected/4bd67cb2-d0ba-4420-be0b-a49ea8817669-kube-api-access-gjkrp\") pod \"crc-debug-dblxr\" (UID: \"4bd67cb2-d0ba-4420-be0b-a49ea8817669\") " pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:43:01 crc kubenswrapper[4922]: I0929 10:43:01.787785 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4bd67cb2-d0ba-4420-be0b-a49ea8817669-host\") pod \"crc-debug-dblxr\" (UID: \"4bd67cb2-d0ba-4420-be0b-a49ea8817669\") " pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:43:01 crc kubenswrapper[4922]: I0929 10:43:01.787862 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjkrp\" (UniqueName: \"kubernetes.io/projected/4bd67cb2-d0ba-4420-be0b-a49ea8817669-kube-api-access-gjkrp\") pod \"crc-debug-dblxr\" (UID: \"4bd67cb2-d0ba-4420-be0b-a49ea8817669\") " pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:43:01 crc kubenswrapper[4922]: I0929 10:43:01.788323 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/4bd67cb2-d0ba-4420-be0b-a49ea8817669-host\") pod \"crc-debug-dblxr\" (UID: \"4bd67cb2-d0ba-4420-be0b-a49ea8817669\") " pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:43:01 crc kubenswrapper[4922]: I0929 10:43:01.810656 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkrp\" (UniqueName: \"kubernetes.io/projected/4bd67cb2-d0ba-4420-be0b-a49ea8817669-kube-api-access-gjkrp\") pod \"crc-debug-dblxr\" (UID: \"4bd67cb2-d0ba-4420-be0b-a49ea8817669\") " pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:43:01 crc kubenswrapper[4922]: I0929 10:43:01.850279 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:43:01 crc kubenswrapper[4922]: W0929 10:43:01.884656 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bd67cb2_d0ba_4420_be0b_a49ea8817669.slice/crio-78d9ed203b7d6becda58a39674bc31fd83e8f4d365ed794a7ac45649c9966741 WatchSource:0}: Error finding container 78d9ed203b7d6becda58a39674bc31fd83e8f4d365ed794a7ac45649c9966741: Status 404 returned error can't find the container with id 78d9ed203b7d6becda58a39674bc31fd83e8f4d365ed794a7ac45649c9966741 Sep 29 10:43:02 crc kubenswrapper[4922]: I0929 10:43:02.030034 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/crc-debug-dblxr" event={"ID":"4bd67cb2-d0ba-4420-be0b-a49ea8817669","Type":"ContainerStarted","Data":"78d9ed203b7d6becda58a39674bc31fd83e8f4d365ed794a7ac45649c9966741"} Sep 29 10:43:14 crc kubenswrapper[4922]: I0929 10:43:14.150161 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/crc-debug-dblxr" event={"ID":"4bd67cb2-d0ba-4420-be0b-a49ea8817669","Type":"ContainerStarted","Data":"955dfc09f81346f263009decf4b3fbd3f4fde4a78c4e559d620a34cb8351f457"} Sep 29 10:43:14 crc kubenswrapper[4922]: I0929 
10:43:14.169938 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l9464/crc-debug-dblxr" podStartSLOduration=2.116454824 podStartE2EDuration="13.169910355s" podCreationTimestamp="2025-09-29 10:43:01 +0000 UTC" firstStartedPulling="2025-09-29 10:43:01.888008769 +0000 UTC m=+3507.254239033" lastFinishedPulling="2025-09-29 10:43:12.9414643 +0000 UTC m=+3518.307694564" observedRunningTime="2025-09-29 10:43:14.165990215 +0000 UTC m=+3519.532220479" watchObservedRunningTime="2025-09-29 10:43:14.169910355 +0000 UTC m=+3519.536140619" Sep 29 10:43:29 crc kubenswrapper[4922]: I0929 10:43:29.070589 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:43:29 crc kubenswrapper[4922]: I0929 10:43:29.071233 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:43:59 crc kubenswrapper[4922]: I0929 10:43:59.070722 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:43:59 crc kubenswrapper[4922]: I0929 10:43:59.071452 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:44:00 crc kubenswrapper[4922]: I0929 10:44:00.827482 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9c9d67f9d-vs7st_c3007912-e64d-4325-beb8-fa3c2dfcbe5e/barbican-api/0.log" Sep 29 10:44:00 crc kubenswrapper[4922]: I0929 10:44:00.846370 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9c9d67f9d-vs7st_c3007912-e64d-4325-beb8-fa3c2dfcbe5e/barbican-api-log/0.log" Sep 29 10:44:01 crc kubenswrapper[4922]: I0929 10:44:01.055577 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cdd7877d-rvfhb_541f048f-4db6-45d6-aaa2-659dc9ff0b86/barbican-keystone-listener/0.log" Sep 29 10:44:01 crc kubenswrapper[4922]: I0929 10:44:01.100179 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cdd7877d-rvfhb_541f048f-4db6-45d6-aaa2-659dc9ff0b86/barbican-keystone-listener-log/0.log" Sep 29 10:44:01 crc kubenswrapper[4922]: I0929 10:44:01.288122 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-59bb77879f-bdcc9_f0d2cc2a-cdf2-490c-a56b-48977a5d83e0/barbican-worker/0.log" Sep 29 10:44:01 crc kubenswrapper[4922]: I0929 10:44:01.303118 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-59bb77879f-bdcc9_f0d2cc2a-cdf2-490c-a56b-48977a5d83e0/barbican-worker-log/0.log" Sep 29 10:44:01 crc kubenswrapper[4922]: I0929 10:44:01.513288 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z_9c5d1232-a030-44f4-823e-5c806d5dd896/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:01 crc kubenswrapper[4922]: I0929 10:44:01.756345 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_ede8dfdb-116f-4e02-8408-aea659020067/ceilometer-central-agent/0.log" Sep 29 10:44:01 crc kubenswrapper[4922]: I0929 10:44:01.771909 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ede8dfdb-116f-4e02-8408-aea659020067/ceilometer-notification-agent/0.log" Sep 29 10:44:01 crc kubenswrapper[4922]: I0929 10:44:01.805815 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ede8dfdb-116f-4e02-8408-aea659020067/proxy-httpd/0.log" Sep 29 10:44:01 crc kubenswrapper[4922]: I0929 10:44:01.951576 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ede8dfdb-116f-4e02-8408-aea659020067/sg-core/0.log" Sep 29 10:44:02 crc kubenswrapper[4922]: I0929 10:44:02.040114 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_34ed6be9-1694-4866-a437-36f08027b85f/cinder-api/0.log" Sep 29 10:44:02 crc kubenswrapper[4922]: I0929 10:44:02.194619 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_34ed6be9-1694-4866-a437-36f08027b85f/cinder-api-log/0.log" Sep 29 10:44:02 crc kubenswrapper[4922]: I0929 10:44:02.290623 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9172c63-e9e9-43c9-a804-72410f85eefe/cinder-scheduler/0.log" Sep 29 10:44:02 crc kubenswrapper[4922]: I0929 10:44:02.429463 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9172c63-e9e9-43c9-a804-72410f85eefe/probe/0.log" Sep 29 10:44:02 crc kubenswrapper[4922]: I0929 10:44:02.534943 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9_b7891137-eddd-4865-9a35-f32a72a1f206/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:02 crc kubenswrapper[4922]: I0929 10:44:02.715800 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc_8653f711-4f91-4ce3-a900-95aa54ac26a1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:02 crc kubenswrapper[4922]: I0929 10:44:02.930706 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6l2dp_fd5275e6-c3d3-474d-962a-3cdafc893dfd/init/0.log" Sep 29 10:44:03 crc kubenswrapper[4922]: I0929 10:44:03.123045 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6l2dp_fd5275e6-c3d3-474d-962a-3cdafc893dfd/init/0.log" Sep 29 10:44:03 crc kubenswrapper[4922]: I0929 10:44:03.167094 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6l2dp_fd5275e6-c3d3-474d-962a-3cdafc893dfd/dnsmasq-dns/0.log" Sep 29 10:44:03 crc kubenswrapper[4922]: I0929 10:44:03.344607 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-gngd5_3cbc70f7-2707-430a-a8d1-d33aee8c7ae8/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:03 crc kubenswrapper[4922]: I0929 10:44:03.420309 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4da22caf-781b-42ef-ad66-521d0908aabb/glance-httpd/0.log" Sep 29 10:44:03 crc kubenswrapper[4922]: I0929 10:44:03.582363 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4da22caf-781b-42ef-ad66-521d0908aabb/glance-log/0.log" Sep 29 10:44:03 crc kubenswrapper[4922]: I0929 10:44:03.687026 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5ec5073f-9a07-4292-8cfb-62e419a0438d/glance-httpd/0.log" Sep 29 10:44:03 crc kubenswrapper[4922]: I0929 10:44:03.857602 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_5ec5073f-9a07-4292-8cfb-62e419a0438d/glance-log/0.log" Sep 29 10:44:03 crc kubenswrapper[4922]: I0929 10:44:03.998577 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58b957f588-sp2bt_84f21d67-d595-4458-871c-e4bbc362b134/horizon/0.log" Sep 29 10:44:04 crc kubenswrapper[4922]: I0929 10:44:04.163839 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-5sq24_d69265aa-752a-4d25-9af4-6dd389d13e8a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:04 crc kubenswrapper[4922]: I0929 10:44:04.464317 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7wdvn_425016bd-6178-497e-ad2b-e150d1cf141f/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:04 crc kubenswrapper[4922]: I0929 10:44:04.500022 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58b957f588-sp2bt_84f21d67-d595-4458-871c-e4bbc362b134/horizon-log/0.log" Sep 29 10:44:04 crc kubenswrapper[4922]: I0929 10:44:04.708209 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-79f578d789-bbw9r_858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99/keystone-api/0.log" Sep 29 10:44:04 crc kubenswrapper[4922]: I0929 10:44:04.742972 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8000729b-19d9-47cd-baa5-7ee4bab9cc04/kube-state-metrics/0.log" Sep 29 10:44:04 crc kubenswrapper[4922]: I0929 10:44:04.953033 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vvllg_bcdc9bf2-2da5-4261-89b5-dd6111d25d3b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:05 crc kubenswrapper[4922]: I0929 10:44:05.350425 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-8b5fcf5f9-p74mm_59b8f377-8449-49f6-992b-6b76ef613283/neutron-httpd/0.log" Sep 29 10:44:05 crc kubenswrapper[4922]: I0929 10:44:05.368298 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8b5fcf5f9-p74mm_59b8f377-8449-49f6-992b-6b76ef613283/neutron-api/0.log" Sep 29 10:44:05 crc kubenswrapper[4922]: I0929 10:44:05.582229 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff_f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:06 crc kubenswrapper[4922]: I0929 10:44:06.226199 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_18e1ff02-c0b1-4095-a40d-b9dc5a492de4/nova-api-log/0.log" Sep 29 10:44:06 crc kubenswrapper[4922]: I0929 10:44:06.341854 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e1139d93-2038-4fa3-b31c-1e7ddedd0bb7/nova-cell0-conductor-conductor/0.log" Sep 29 10:44:06 crc kubenswrapper[4922]: I0929 10:44:06.360781 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_18e1ff02-c0b1-4095-a40d-b9dc5a492de4/nova-api-api/0.log" Sep 29 10:44:06 crc kubenswrapper[4922]: I0929 10:44:06.725763 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4d6968da-188d-461e-ab0f-00bf3e2fab0c/nova-cell1-conductor-conductor/0.log" Sep 29 10:44:06 crc kubenswrapper[4922]: I0929 10:44:06.869325 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528/nova-cell1-novncproxy-novncproxy/0.log" Sep 29 10:44:07 crc kubenswrapper[4922]: I0929 10:44:07.010242 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-575vl_39297e68-ef0c-4e52-922d-28805d4a7171/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:07 crc kubenswrapper[4922]: I0929 10:44:07.197301 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6e84e53c-d007-4780-be8e-1794d0c7b88f/nova-metadata-log/0.log" Sep 29 10:44:07 crc kubenswrapper[4922]: I0929 10:44:07.655726 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ee836f5f-3a1b-4c14-9234-711246af0b41/nova-scheduler-scheduler/0.log" Sep 29 10:44:07 crc kubenswrapper[4922]: I0929 10:44:07.951949 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7c255b1-65cb-42e0-b799-e3a735956220/mysql-bootstrap/0.log" Sep 29 10:44:08 crc kubenswrapper[4922]: I0929 10:44:08.097974 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7c255b1-65cb-42e0-b799-e3a735956220/mysql-bootstrap/0.log" Sep 29 10:44:08 crc kubenswrapper[4922]: I0929 10:44:08.200061 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7c255b1-65cb-42e0-b799-e3a735956220/galera/0.log" Sep 29 10:44:08 crc kubenswrapper[4922]: I0929 10:44:08.450114 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f2f851a-d7af-4580-8867-6865c5f1d4ce/mysql-bootstrap/0.log" Sep 29 10:44:08 crc kubenswrapper[4922]: I0929 10:44:08.641558 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f2f851a-d7af-4580-8867-6865c5f1d4ce/mysql-bootstrap/0.log" Sep 29 10:44:08 crc kubenswrapper[4922]: I0929 10:44:08.674293 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f2f851a-d7af-4580-8867-6865c5f1d4ce/galera/0.log" Sep 29 10:44:08 crc kubenswrapper[4922]: I0929 10:44:08.684319 4922 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-metadata-0_6e84e53c-d007-4780-be8e-1794d0c7b88f/nova-metadata-metadata/0.log" Sep 29 10:44:08 crc kubenswrapper[4922]: I0929 10:44:08.885730 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c6091d68-5a19-44af-8ffb-ec05b516a160/openstackclient/0.log" Sep 29 10:44:09 crc kubenswrapper[4922]: I0929 10:44:09.154080 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6kqsg_12d2ae39-f918-485b-a8c4-b083cdf9d48f/ovn-controller/0.log" Sep 29 10:44:09 crc kubenswrapper[4922]: I0929 10:44:09.213219 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cgtjg_cf954c93-5942-433b-bbb7-6f0737969eb5/openstack-network-exporter/0.log" Sep 29 10:44:09 crc kubenswrapper[4922]: I0929 10:44:09.440968 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bzdcj_404af620-a2df-4414-acfc-b669e8518298/ovsdb-server-init/0.log" Sep 29 10:44:09 crc kubenswrapper[4922]: I0929 10:44:09.680380 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bzdcj_404af620-a2df-4414-acfc-b669e8518298/ovs-vswitchd/0.log" Sep 29 10:44:09 crc kubenswrapper[4922]: I0929 10:44:09.714142 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bzdcj_404af620-a2df-4414-acfc-b669e8518298/ovsdb-server/0.log" Sep 29 10:44:09 crc kubenswrapper[4922]: I0929 10:44:09.725596 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bzdcj_404af620-a2df-4414-acfc-b669e8518298/ovsdb-server-init/0.log" Sep 29 10:44:09 crc kubenswrapper[4922]: I0929 10:44:09.987204 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fz6hz_4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:10 crc kubenswrapper[4922]: I0929 
10:44:10.166446 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6b7e7da6-14cb-4046-b71d-8039326ca601/openstack-network-exporter/0.log" Sep 29 10:44:10 crc kubenswrapper[4922]: I0929 10:44:10.260628 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6b7e7da6-14cb-4046-b71d-8039326ca601/ovn-northd/0.log" Sep 29 10:44:10 crc kubenswrapper[4922]: I0929 10:44:10.422901 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d76a9416-91c8-4df6-b6a4-898c4df4ac1a/openstack-network-exporter/0.log" Sep 29 10:44:10 crc kubenswrapper[4922]: I0929 10:44:10.513646 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d76a9416-91c8-4df6-b6a4-898c4df4ac1a/ovsdbserver-nb/0.log" Sep 29 10:44:10 crc kubenswrapper[4922]: I0929 10:44:10.621656 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d267e81e-9044-4619-b2f2-4c370674a31c/openstack-network-exporter/0.log" Sep 29 10:44:10 crc kubenswrapper[4922]: I0929 10:44:10.759688 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d267e81e-9044-4619-b2f2-4c370674a31c/ovsdbserver-sb/0.log" Sep 29 10:44:10 crc kubenswrapper[4922]: I0929 10:44:10.899612 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7fc9c79bdd-vqp6p_aceaf0a2-2b2b-4ef9-99d1-8bd21f553634/placement-api/0.log" Sep 29 10:44:11 crc kubenswrapper[4922]: I0929 10:44:11.374098 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7fc9c79bdd-vqp6p_aceaf0a2-2b2b-4ef9-99d1-8bd21f553634/placement-log/0.log" Sep 29 10:44:11 crc kubenswrapper[4922]: I0929 10:44:11.433668 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_82b45f77-ae02-47df-b1ab-5137f6e23089/setup-container/0.log" Sep 29 10:44:11 crc kubenswrapper[4922]: I0929 10:44:11.675412 4922 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_82b45f77-ae02-47df-b1ab-5137f6e23089/setup-container/0.log" Sep 29 10:44:11 crc kubenswrapper[4922]: I0929 10:44:11.679147 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_82b45f77-ae02-47df-b1ab-5137f6e23089/rabbitmq/0.log" Sep 29 10:44:11 crc kubenswrapper[4922]: I0929 10:44:11.895844 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d2c1fe4-f762-40fa-8439-f74d3e234d30/setup-container/0.log" Sep 29 10:44:12 crc kubenswrapper[4922]: I0929 10:44:12.103783 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d2c1fe4-f762-40fa-8439-f74d3e234d30/setup-container/0.log" Sep 29 10:44:12 crc kubenswrapper[4922]: I0929 10:44:12.148573 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d2c1fe4-f762-40fa-8439-f74d3e234d30/rabbitmq/0.log" Sep 29 10:44:12 crc kubenswrapper[4922]: I0929 10:44:12.360089 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr_2bb4b88d-fc96-488b-a144-7f524d2cd1e7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:12 crc kubenswrapper[4922]: I0929 10:44:12.436319 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-2rqds_782111a0-a54f-49fa-a519-e0d3a68e9cbf/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:12 crc kubenswrapper[4922]: I0929 10:44:12.679462 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t_7d952f02-09db-44fd-ae8b-6b2c8ea06505/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:12 crc kubenswrapper[4922]: I0929 10:44:12.884650 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kfbbn_3ae2127d-25fd-4296-9143-1f12b7ffd0c2/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:13 crc kubenswrapper[4922]: I0929 10:44:13.043329 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ntxlw_b579a838-93e1-47ac-8069-b49e76d8d630/ssh-known-hosts-edpm-deployment/0.log" Sep 29 10:44:13 crc kubenswrapper[4922]: I0929 10:44:13.300045 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f74f76895-9f28s_1b044ac1-a144-454a-a2f7-bf438ba13cc0/proxy-httpd/0.log" Sep 29 10:44:13 crc kubenswrapper[4922]: I0929 10:44:13.322268 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f74f76895-9f28s_1b044ac1-a144-454a-a2f7-bf438ba13cc0/proxy-server/0.log" Sep 29 10:44:13 crc kubenswrapper[4922]: I0929 10:44:13.497076 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tvgqs_396dcf64-c14b-4e56-9533-dbadbfac272a/swift-ring-rebalance/0.log" Sep 29 10:44:13 crc kubenswrapper[4922]: I0929 10:44:13.617464 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/account-auditor/0.log" Sep 29 10:44:13 crc kubenswrapper[4922]: I0929 10:44:13.768561 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/account-reaper/0.log" Sep 29 10:44:13 crc kubenswrapper[4922]: I0929 10:44:13.897347 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/account-server/0.log" Sep 29 10:44:13 crc kubenswrapper[4922]: I0929 10:44:13.917401 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/account-replicator/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 
10:44:14.000791 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/container-auditor/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.147773 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/container-server/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.177506 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/container-replicator/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.255505 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/container-updater/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.474293 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-auditor/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.482723 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-expirer/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.593598 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-replicator/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.710007 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-server/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.763618 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-updater/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.890587 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/rsync/0.log" Sep 29 10:44:14 crc kubenswrapper[4922]: I0929 10:44:14.925040 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/swift-recon-cron/0.log" Sep 29 10:44:15 crc kubenswrapper[4922]: I0929 10:44:15.251930 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dd76x_a810e32e-1655-40f8-b445-9922b0d5603f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:15 crc kubenswrapper[4922]: I0929 10:44:15.386377 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b34ceaf2-30f5-4be7-8806-fad8a2bd21ab/tempest-tests-tempest-tests-runner/0.log" Sep 29 10:44:15 crc kubenswrapper[4922]: I0929 10:44:15.498184 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df/test-operator-logs-container/0.log" Sep 29 10:44:15 crc kubenswrapper[4922]: I0929 10:44:15.721466 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn_6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:44:25 crc kubenswrapper[4922]: I0929 10:44:25.778143 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c/memcached/0.log" Sep 29 10:44:29 crc kubenswrapper[4922]: I0929 10:44:29.070481 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:44:29 crc kubenswrapper[4922]: I0929 
10:44:29.071134 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:44:29 crc kubenswrapper[4922]: I0929 10:44:29.071187 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 10:44:29 crc kubenswrapper[4922]: I0929 10:44:29.072975 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:44:29 crc kubenswrapper[4922]: I0929 10:44:29.073087 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" gracePeriod=600 Sep 29 10:44:29 crc kubenswrapper[4922]: E0929 10:44:29.837682 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:44:29 crc kubenswrapper[4922]: I0929 10:44:29.892552 4922 generic.go:334] "Generic (PLEG): container finished" 
podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" exitCode=0 Sep 29 10:44:29 crc kubenswrapper[4922]: I0929 10:44:29.892598 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401"} Sep 29 10:44:29 crc kubenswrapper[4922]: I0929 10:44:29.892633 4922 scope.go:117] "RemoveContainer" containerID="8a9e40e65a014756644053b12ffe5e16feda853e5105ed46b8827c7da0ecf1b6" Sep 29 10:44:29 crc kubenswrapper[4922]: I0929 10:44:29.893362 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:44:29 crc kubenswrapper[4922]: E0929 10:44:29.893709 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:44:42 crc kubenswrapper[4922]: I0929 10:44:42.452237 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:44:42 crc kubenswrapper[4922]: E0929 10:44:42.452872 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 
10:44:57 crc kubenswrapper[4922]: I0929 10:44:57.452894 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:44:57 crc kubenswrapper[4922]: E0929 10:44:57.453811 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.206689 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx"] Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.208547 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.213298 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.213428 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.221094 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx"] Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.388064 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq8kh\" (UniqueName: \"kubernetes.io/projected/269731ec-c964-44ec-9167-9ced86b207a5-kube-api-access-wq8kh\") pod \"collect-profiles-29319045-sghlx\" 
(UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.388161 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269731ec-c964-44ec-9167-9ced86b207a5-config-volume\") pod \"collect-profiles-29319045-sghlx\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.388184 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/269731ec-c964-44ec-9167-9ced86b207a5-secret-volume\") pod \"collect-profiles-29319045-sghlx\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.490688 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq8kh\" (UniqueName: \"kubernetes.io/projected/269731ec-c964-44ec-9167-9ced86b207a5-kube-api-access-wq8kh\") pod \"collect-profiles-29319045-sghlx\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.490791 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269731ec-c964-44ec-9167-9ced86b207a5-config-volume\") pod \"collect-profiles-29319045-sghlx\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.490822 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/269731ec-c964-44ec-9167-9ced86b207a5-secret-volume\") pod \"collect-profiles-29319045-sghlx\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.492732 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269731ec-c964-44ec-9167-9ced86b207a5-config-volume\") pod \"collect-profiles-29319045-sghlx\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.497882 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/269731ec-c964-44ec-9167-9ced86b207a5-secret-volume\") pod \"collect-profiles-29319045-sghlx\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.510405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq8kh\" (UniqueName: \"kubernetes.io/projected/269731ec-c964-44ec-9167-9ced86b207a5-kube-api-access-wq8kh\") pod \"collect-profiles-29319045-sghlx\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:00 crc kubenswrapper[4922]: I0929 10:45:00.535141 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:01 crc kubenswrapper[4922]: I0929 10:45:01.009214 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx"] Sep 29 10:45:01 crc kubenswrapper[4922]: W0929 10:45:01.029242 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod269731ec_c964_44ec_9167_9ced86b207a5.slice/crio-08cbcdf4a1af996d81765deced4118c9a5dad86215acc44e1b96f6ba881ca39f WatchSource:0}: Error finding container 08cbcdf4a1af996d81765deced4118c9a5dad86215acc44e1b96f6ba881ca39f: Status 404 returned error can't find the container with id 08cbcdf4a1af996d81765deced4118c9a5dad86215acc44e1b96f6ba881ca39f Sep 29 10:45:01 crc kubenswrapper[4922]: I0929 10:45:01.189712 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" event={"ID":"269731ec-c964-44ec-9167-9ced86b207a5","Type":"ContainerStarted","Data":"08cbcdf4a1af996d81765deced4118c9a5dad86215acc44e1b96f6ba881ca39f"} Sep 29 10:45:02 crc kubenswrapper[4922]: I0929 10:45:02.200009 4922 generic.go:334] "Generic (PLEG): container finished" podID="269731ec-c964-44ec-9167-9ced86b207a5" containerID="421ee437918a6e36a63385f5b07cf1e0ceb1bf9011c810d3764c50ac8dfcaa33" exitCode=0 Sep 29 10:45:02 crc kubenswrapper[4922]: I0929 10:45:02.200102 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" event={"ID":"269731ec-c964-44ec-9167-9ced86b207a5","Type":"ContainerDied","Data":"421ee437918a6e36a63385f5b07cf1e0ceb1bf9011c810d3764c50ac8dfcaa33"} Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.545928 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.664136 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq8kh\" (UniqueName: \"kubernetes.io/projected/269731ec-c964-44ec-9167-9ced86b207a5-kube-api-access-wq8kh\") pod \"269731ec-c964-44ec-9167-9ced86b207a5\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.664318 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269731ec-c964-44ec-9167-9ced86b207a5-config-volume\") pod \"269731ec-c964-44ec-9167-9ced86b207a5\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.664404 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/269731ec-c964-44ec-9167-9ced86b207a5-secret-volume\") pod \"269731ec-c964-44ec-9167-9ced86b207a5\" (UID: \"269731ec-c964-44ec-9167-9ced86b207a5\") " Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.664986 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269731ec-c964-44ec-9167-9ced86b207a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "269731ec-c964-44ec-9167-9ced86b207a5" (UID: "269731ec-c964-44ec-9167-9ced86b207a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.677426 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269731ec-c964-44ec-9167-9ced86b207a5-kube-api-access-wq8kh" (OuterVolumeSpecName: "kube-api-access-wq8kh") pod "269731ec-c964-44ec-9167-9ced86b207a5" (UID: "269731ec-c964-44ec-9167-9ced86b207a5"). 
InnerVolumeSpecName "kube-api-access-wq8kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.677946 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269731ec-c964-44ec-9167-9ced86b207a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "269731ec-c964-44ec-9167-9ced86b207a5" (UID: "269731ec-c964-44ec-9167-9ced86b207a5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.766421 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269731ec-c964-44ec-9167-9ced86b207a5-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.766467 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/269731ec-c964-44ec-9167-9ced86b207a5-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:03 crc kubenswrapper[4922]: I0929 10:45:03.766482 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq8kh\" (UniqueName: \"kubernetes.io/projected/269731ec-c964-44ec-9167-9ced86b207a5-kube-api-access-wq8kh\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:04 crc kubenswrapper[4922]: I0929 10:45:04.218522 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" event={"ID":"269731ec-c964-44ec-9167-9ced86b207a5","Type":"ContainerDied","Data":"08cbcdf4a1af996d81765deced4118c9a5dad86215acc44e1b96f6ba881ca39f"} Sep 29 10:45:04 crc kubenswrapper[4922]: I0929 10:45:04.218566 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08cbcdf4a1af996d81765deced4118c9a5dad86215acc44e1b96f6ba881ca39f" Sep 29 10:45:04 crc kubenswrapper[4922]: I0929 10:45:04.218604 4922 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319045-sghlx" Sep 29 10:45:04 crc kubenswrapper[4922]: I0929 10:45:04.623444 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9"] Sep 29 10:45:04 crc kubenswrapper[4922]: I0929 10:45:04.631939 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319000-5vgv9"] Sep 29 10:45:05 crc kubenswrapper[4922]: I0929 10:45:05.468027 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae87e51-bfce-41e2-b41a-327df982e7aa" path="/var/lib/kubelet/pods/6ae87e51-bfce-41e2-b41a-327df982e7aa/volumes" Sep 29 10:45:08 crc kubenswrapper[4922]: I0929 10:45:08.451549 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:45:08 crc kubenswrapper[4922]: E0929 10:45:08.453259 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:45:18 crc kubenswrapper[4922]: I0929 10:45:18.344507 4922 generic.go:334] "Generic (PLEG): container finished" podID="4bd67cb2-d0ba-4420-be0b-a49ea8817669" containerID="955dfc09f81346f263009decf4b3fbd3f4fde4a78c4e559d620a34cb8351f457" exitCode=0 Sep 29 10:45:18 crc kubenswrapper[4922]: I0929 10:45:18.344672 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/crc-debug-dblxr" 
event={"ID":"4bd67cb2-d0ba-4420-be0b-a49ea8817669","Type":"ContainerDied","Data":"955dfc09f81346f263009decf4b3fbd3f4fde4a78c4e559d620a34cb8351f457"} Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.451702 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:45:19 crc kubenswrapper[4922]: E0929 10:45:19.452235 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.470979 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.510951 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l9464/crc-debug-dblxr"] Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.523262 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l9464/crc-debug-dblxr"] Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.594277 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4bd67cb2-d0ba-4420-be0b-a49ea8817669-host\") pod \"4bd67cb2-d0ba-4420-be0b-a49ea8817669\" (UID: \"4bd67cb2-d0ba-4420-be0b-a49ea8817669\") " Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.594552 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjkrp\" (UniqueName: \"kubernetes.io/projected/4bd67cb2-d0ba-4420-be0b-a49ea8817669-kube-api-access-gjkrp\") pod 
\"4bd67cb2-d0ba-4420-be0b-a49ea8817669\" (UID: \"4bd67cb2-d0ba-4420-be0b-a49ea8817669\") " Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.594728 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bd67cb2-d0ba-4420-be0b-a49ea8817669-host" (OuterVolumeSpecName: "host") pod "4bd67cb2-d0ba-4420-be0b-a49ea8817669" (UID: "4bd67cb2-d0ba-4420-be0b-a49ea8817669"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.595460 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4bd67cb2-d0ba-4420-be0b-a49ea8817669-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.603368 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd67cb2-d0ba-4420-be0b-a49ea8817669-kube-api-access-gjkrp" (OuterVolumeSpecName: "kube-api-access-gjkrp") pod "4bd67cb2-d0ba-4420-be0b-a49ea8817669" (UID: "4bd67cb2-d0ba-4420-be0b-a49ea8817669"). InnerVolumeSpecName "kube-api-access-gjkrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:45:19 crc kubenswrapper[4922]: I0929 10:45:19.700418 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjkrp\" (UniqueName: \"kubernetes.io/projected/4bd67cb2-d0ba-4420-be0b-a49ea8817669-kube-api-access-gjkrp\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.362956 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78d9ed203b7d6becda58a39674bc31fd83e8f4d365ed794a7ac45649c9966741" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.363015 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-dblxr" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.659896 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l9464/crc-debug-ld7x6"] Sep 29 10:45:20 crc kubenswrapper[4922]: E0929 10:45:20.660544 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269731ec-c964-44ec-9167-9ced86b207a5" containerName="collect-profiles" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.660557 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="269731ec-c964-44ec-9167-9ced86b207a5" containerName="collect-profiles" Sep 29 10:45:20 crc kubenswrapper[4922]: E0929 10:45:20.660585 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd67cb2-d0ba-4420-be0b-a49ea8817669" containerName="container-00" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.660591 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd67cb2-d0ba-4420-be0b-a49ea8817669" containerName="container-00" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.660807 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd67cb2-d0ba-4420-be0b-a49ea8817669" containerName="container-00" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.660825 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="269731ec-c964-44ec-9167-9ced86b207a5" containerName="collect-profiles" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.661528 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.821555 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-host\") pod \"crc-debug-ld7x6\" (UID: \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\") " pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.821713 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbnvj\" (UniqueName: \"kubernetes.io/projected/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-kube-api-access-cbnvj\") pod \"crc-debug-ld7x6\" (UID: \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\") " pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.923395 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbnvj\" (UniqueName: \"kubernetes.io/projected/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-kube-api-access-cbnvj\") pod \"crc-debug-ld7x6\" (UID: \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\") " pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.923485 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-host\") pod \"crc-debug-ld7x6\" (UID: \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\") " pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.923616 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-host\") pod \"crc-debug-ld7x6\" (UID: \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\") " pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:20 crc 
kubenswrapper[4922]: I0929 10:45:20.950959 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbnvj\" (UniqueName: \"kubernetes.io/projected/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-kube-api-access-cbnvj\") pod \"crc-debug-ld7x6\" (UID: \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\") " pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:20 crc kubenswrapper[4922]: I0929 10:45:20.980593 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:21 crc kubenswrapper[4922]: I0929 10:45:21.372538 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/crc-debug-ld7x6" event={"ID":"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02","Type":"ContainerStarted","Data":"40d974c544859195cccc1add38e6f3aedc77f08a877fc4cfe6339b5db460abd9"} Sep 29 10:45:21 crc kubenswrapper[4922]: I0929 10:45:21.373043 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/crc-debug-ld7x6" event={"ID":"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02","Type":"ContainerStarted","Data":"477aecf2195959a8732168c17b3fdd275eef5b14f453a773e863c6e35efdf72b"} Sep 29 10:45:21 crc kubenswrapper[4922]: I0929 10:45:21.392382 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l9464/crc-debug-ld7x6" podStartSLOduration=1.392358244 podStartE2EDuration="1.392358244s" podCreationTimestamp="2025-09-29 10:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:45:21.387474025 +0000 UTC m=+3646.753704289" watchObservedRunningTime="2025-09-29 10:45:21.392358244 +0000 UTC m=+3646.758588508" Sep 29 10:45:21 crc kubenswrapper[4922]: I0929 10:45:21.464247 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd67cb2-d0ba-4420-be0b-a49ea8817669" 
path="/var/lib/kubelet/pods/4bd67cb2-d0ba-4420-be0b-a49ea8817669/volumes" Sep 29 10:45:22 crc kubenswrapper[4922]: I0929 10:45:22.384131 4922 generic.go:334] "Generic (PLEG): container finished" podID="fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02" containerID="40d974c544859195cccc1add38e6f3aedc77f08a877fc4cfe6339b5db460abd9" exitCode=0 Sep 29 10:45:22 crc kubenswrapper[4922]: I0929 10:45:22.384189 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/crc-debug-ld7x6" event={"ID":"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02","Type":"ContainerDied","Data":"40d974c544859195cccc1add38e6f3aedc77f08a877fc4cfe6339b5db460abd9"} Sep 29 10:45:23 crc kubenswrapper[4922]: I0929 10:45:23.494509 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:23 crc kubenswrapper[4922]: I0929 10:45:23.670288 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-host\") pod \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\" (UID: \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\") " Sep 29 10:45:23 crc kubenswrapper[4922]: I0929 10:45:23.670382 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-host" (OuterVolumeSpecName: "host") pod "fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02" (UID: "fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:45:23 crc kubenswrapper[4922]: I0929 10:45:23.670451 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbnvj\" (UniqueName: \"kubernetes.io/projected/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-kube-api-access-cbnvj\") pod \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\" (UID: \"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02\") " Sep 29 10:45:23 crc kubenswrapper[4922]: I0929 10:45:23.671154 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:23 crc kubenswrapper[4922]: I0929 10:45:23.675792 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-kube-api-access-cbnvj" (OuterVolumeSpecName: "kube-api-access-cbnvj") pod "fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02" (UID: "fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02"). InnerVolumeSpecName "kube-api-access-cbnvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:45:23 crc kubenswrapper[4922]: I0929 10:45:23.772399 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbnvj\" (UniqueName: \"kubernetes.io/projected/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02-kube-api-access-cbnvj\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:24 crc kubenswrapper[4922]: I0929 10:45:24.400742 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/crc-debug-ld7x6" event={"ID":"fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02","Type":"ContainerDied","Data":"477aecf2195959a8732168c17b3fdd275eef5b14f453a773e863c6e35efdf72b"} Sep 29 10:45:24 crc kubenswrapper[4922]: I0929 10:45:24.401089 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="477aecf2195959a8732168c17b3fdd275eef5b14f453a773e863c6e35efdf72b" Sep 29 10:45:24 crc kubenswrapper[4922]: I0929 10:45:24.400817 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-ld7x6" Sep 29 10:45:27 crc kubenswrapper[4922]: I0929 10:45:27.873291 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l9464/crc-debug-ld7x6"] Sep 29 10:45:27 crc kubenswrapper[4922]: I0929 10:45:27.883752 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l9464/crc-debug-ld7x6"] Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.053171 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l9464/crc-debug-rnzqp"] Sep 29 10:45:29 crc kubenswrapper[4922]: E0929 10:45:29.053864 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02" containerName="container-00" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.053878 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02" containerName="container-00" Sep 29 10:45:29 crc 
kubenswrapper[4922]: I0929 10:45:29.054051 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02" containerName="container-00" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.054666 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.161723 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8c58\" (UniqueName: \"kubernetes.io/projected/7d154b1b-3915-4722-a99d-687ddf64ecf7-kube-api-access-g8c58\") pod \"crc-debug-rnzqp\" (UID: \"7d154b1b-3915-4722-a99d-687ddf64ecf7\") " pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.162196 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d154b1b-3915-4722-a99d-687ddf64ecf7-host\") pod \"crc-debug-rnzqp\" (UID: \"7d154b1b-3915-4722-a99d-687ddf64ecf7\") " pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.264220 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d154b1b-3915-4722-a99d-687ddf64ecf7-host\") pod \"crc-debug-rnzqp\" (UID: \"7d154b1b-3915-4722-a99d-687ddf64ecf7\") " pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.264379 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d154b1b-3915-4722-a99d-687ddf64ecf7-host\") pod \"crc-debug-rnzqp\" (UID: \"7d154b1b-3915-4722-a99d-687ddf64ecf7\") " pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.264399 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g8c58\" (UniqueName: \"kubernetes.io/projected/7d154b1b-3915-4722-a99d-687ddf64ecf7-kube-api-access-g8c58\") pod \"crc-debug-rnzqp\" (UID: \"7d154b1b-3915-4722-a99d-687ddf64ecf7\") " pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.283223 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8c58\" (UniqueName: \"kubernetes.io/projected/7d154b1b-3915-4722-a99d-687ddf64ecf7-kube-api-access-g8c58\") pod \"crc-debug-rnzqp\" (UID: \"7d154b1b-3915-4722-a99d-687ddf64ecf7\") " pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.375171 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.447347 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/crc-debug-rnzqp" event={"ID":"7d154b1b-3915-4722-a99d-687ddf64ecf7","Type":"ContainerStarted","Data":"11b57b132af48eac92705b8d947beff22074d80cfbea9619a8800c0fc710af58"} Sep 29 10:45:29 crc kubenswrapper[4922]: I0929 10:45:29.463080 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02" path="/var/lib/kubelet/pods/fb0e239a-c5cb-48c6-bed1-8bd7c94d3a02/volumes" Sep 29 10:45:30 crc kubenswrapper[4922]: I0929 10:45:30.459820 4922 generic.go:334] "Generic (PLEG): container finished" podID="7d154b1b-3915-4722-a99d-687ddf64ecf7" containerID="f6cc0b2a43eadaab6f64fca75add3cf903eb959c0309ddccfd82a426c2c8b2aa" exitCode=0 Sep 29 10:45:30 crc kubenswrapper[4922]: I0929 10:45:30.459890 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/crc-debug-rnzqp" 
event={"ID":"7d154b1b-3915-4722-a99d-687ddf64ecf7","Type":"ContainerDied","Data":"f6cc0b2a43eadaab6f64fca75add3cf903eb959c0309ddccfd82a426c2c8b2aa"} Sep 29 10:45:30 crc kubenswrapper[4922]: I0929 10:45:30.493625 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l9464/crc-debug-rnzqp"] Sep 29 10:45:30 crc kubenswrapper[4922]: I0929 10:45:30.503592 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l9464/crc-debug-rnzqp"] Sep 29 10:45:31 crc kubenswrapper[4922]: I0929 10:45:31.453032 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:45:31 crc kubenswrapper[4922]: E0929 10:45:31.453295 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:45:31 crc kubenswrapper[4922]: I0929 10:45:31.602462 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:31 crc kubenswrapper[4922]: I0929 10:45:31.710393 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d154b1b-3915-4722-a99d-687ddf64ecf7-host\") pod \"7d154b1b-3915-4722-a99d-687ddf64ecf7\" (UID: \"7d154b1b-3915-4722-a99d-687ddf64ecf7\") " Sep 29 10:45:31 crc kubenswrapper[4922]: I0929 10:45:31.711121 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8c58\" (UniqueName: \"kubernetes.io/projected/7d154b1b-3915-4722-a99d-687ddf64ecf7-kube-api-access-g8c58\") pod \"7d154b1b-3915-4722-a99d-687ddf64ecf7\" (UID: \"7d154b1b-3915-4722-a99d-687ddf64ecf7\") " Sep 29 10:45:31 crc kubenswrapper[4922]: I0929 10:45:31.710546 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d154b1b-3915-4722-a99d-687ddf64ecf7-host" (OuterVolumeSpecName: "host") pod "7d154b1b-3915-4722-a99d-687ddf64ecf7" (UID: "7d154b1b-3915-4722-a99d-687ddf64ecf7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:45:31 crc kubenswrapper[4922]: I0929 10:45:31.711885 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d154b1b-3915-4722-a99d-687ddf64ecf7-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:31 crc kubenswrapper[4922]: I0929 10:45:31.717785 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d154b1b-3915-4722-a99d-687ddf64ecf7-kube-api-access-g8c58" (OuterVolumeSpecName: "kube-api-access-g8c58") pod "7d154b1b-3915-4722-a99d-687ddf64ecf7" (UID: "7d154b1b-3915-4722-a99d-687ddf64ecf7"). InnerVolumeSpecName "kube-api-access-g8c58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:45:31 crc kubenswrapper[4922]: I0929 10:45:31.813685 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8c58\" (UniqueName: \"kubernetes.io/projected/7d154b1b-3915-4722-a99d-687ddf64ecf7-kube-api-access-g8c58\") on node \"crc\" DevicePath \"\"" Sep 29 10:45:31 crc kubenswrapper[4922]: I0929 10:45:31.946846 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-wxkn7_733a9696-fd92-42d4-b4df-6e4ba3d9d433/kube-rbac-proxy/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.034208 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-wxkn7_733a9696-fd92-42d4-b4df-6e4ba3d9d433/manager/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.142682 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-8vhzj_05dca5c7-0856-4c86-9bf8-99c6edc07252/kube-rbac-proxy/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.196056 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-8vhzj_05dca5c7-0856-4c86-9bf8-99c6edc07252/manager/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.314609 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-45kw4_2536f9c0-aac9-4d2c-be19-8afe9ac2e418/manager/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.321487 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-45kw4_2536f9c0-aac9-4d2c-be19-8afe9ac2e418/kube-rbac-proxy/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.402666 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/util/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.477600 4922 scope.go:117] "RemoveContainer" containerID="f6cc0b2a43eadaab6f64fca75add3cf903eb959c0309ddccfd82a426c2c8b2aa" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.477657 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/crc-debug-rnzqp" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.596938 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/util/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.613708 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/pull/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.631972 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/pull/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.810901 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/pull/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.811034 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/extract/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.822691 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/util/0.log" Sep 29 10:45:32 crc kubenswrapper[4922]: I0929 10:45:32.972760 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-ph6mk_0ed6eee8-7938-4f36-98f8-99af2cc40a4e/kube-rbac-proxy/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.041820 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-jhr5q_88c2443d-e9bf-441b-ae76-93b7f63c790b/kube-rbac-proxy/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.064395 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-ph6mk_0ed6eee8-7938-4f36-98f8-99af2cc40a4e/manager/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.198903 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-jhr5q_88c2443d-e9bf-441b-ae76-93b7f63c790b/manager/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.213209 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-ft79c_aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0/kube-rbac-proxy/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.288273 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-ft79c_aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0/manager/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.410923 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-8pnnw_698d9305-27b8-44fe-bcd9-f034bdfa9b09/kube-rbac-proxy/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.462247 
4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d154b1b-3915-4722-a99d-687ddf64ecf7" path="/var/lib/kubelet/pods/7d154b1b-3915-4722-a99d-687ddf64ecf7/volumes" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.541875 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-8pnnw_698d9305-27b8-44fe-bcd9-f034bdfa9b09/manager/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.636539 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-jbs5x_fe4d01cb-1457-4cca-b2ed-7da6250a47df/manager/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.668787 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-jbs5x_fe4d01cb-1457-4cca-b2ed-7da6250a47df/kube-rbac-proxy/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.746391 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-xns9g_39c6dedb-23e2-4515-83c8-1e85e0136cc8/kube-rbac-proxy/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.872356 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-xns9g_39c6dedb-23e2-4515-83c8-1e85e0136cc8/manager/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.915359 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-r8wxg_c9221095-3450-45f9-9aa2-e4994c8471ef/kube-rbac-proxy/0.log" Sep 29 10:45:33 crc kubenswrapper[4922]: I0929 10:45:33.959638 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-r8wxg_c9221095-3450-45f9-9aa2-e4994c8471ef/manager/0.log" Sep 29 10:45:34 crc 
kubenswrapper[4922]: I0929 10:45:34.093543 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-b69kd_9c51299d-7ce3-4dff-b555-8cc2bcee6e4c/kube-rbac-proxy/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.108921 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-b69kd_9c51299d-7ce3-4dff-b555-8cc2bcee6e4c/manager/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.257225 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-r4lcr_8ad2a8b0-1e70-47e2-80a1-139eedb15541/kube-rbac-proxy/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.313471 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-r4lcr_8ad2a8b0-1e70-47e2-80a1-139eedb15541/manager/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.331878 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-knzwz_a01ec1f8-817f-4ed8-9431-01847d4956be/kube-rbac-proxy/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.513906 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-knzwz_a01ec1f8-817f-4ed8-9431-01847d4956be/manager/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.522773 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-5wnjj_60630351-afcc-4792-bb16-5994368117cd/kube-rbac-proxy/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.553970 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-5wnjj_60630351-afcc-4792-bb16-5994368117cd/manager/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.686967 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-p2hsw_2bbc35ab-8adc-445e-bc17-690ce9533a3e/kube-rbac-proxy/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.689269 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-p2hsw_2bbc35ab-8adc-445e-bc17-690ce9533a3e/manager/0.log" Sep 29 10:45:34 crc kubenswrapper[4922]: I0929 10:45:34.848340 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8d8dc476c-zjv29_00b67606-b7e3-4043-abff-cae6f14ba095/kube-rbac-proxy/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 10:45:35.001533 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7484b66f-slfdl_77c866e5-8ec4-47ac-809c-0fc002c47957/kube-rbac-proxy/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 10:45:35.238488 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7484b66f-slfdl_77c866e5-8ec4-47ac-809c-0fc002c47957/operator/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 10:45:35.244248 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-8r27w_822c4c9d-3c4c-43db-a891-19b9db1d279b/registry-server/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 10:45:35.320549 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-rvj4d_c4ba5f8a-ca61-4870-bc8e-017e79e139a5/kube-rbac-proxy/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 
10:45:35.493754 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-pnns5_c38d04c4-b717-4155-b646-b06c3dac3386/kube-rbac-proxy/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 10:45:35.554031 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-rvj4d_c4ba5f8a-ca61-4870-bc8e-017e79e139a5/manager/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 10:45:35.594440 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-pnns5_c38d04c4-b717-4155-b646-b06c3dac3386/manager/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 10:45:35.742136 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-lspp8_c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3/operator/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 10:45:35.817043 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-82g7r_71aa3678-e2c4-4a23-9e66-738fddb6066f/kube-rbac-proxy/0.log" Sep 29 10:45:35 crc kubenswrapper[4922]: I0929 10:45:35.965304 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-82g7r_71aa3678-e2c4-4a23-9e66-738fddb6066f/manager/0.log" Sep 29 10:45:36 crc kubenswrapper[4922]: I0929 10:45:36.027145 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-5s5wf_f4fbefa3-c5d4-4a51-b90a-512ebfcef863/kube-rbac-proxy/0.log" Sep 29 10:45:36 crc kubenswrapper[4922]: I0929 10:45:36.105227 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8d8dc476c-zjv29_00b67606-b7e3-4043-abff-cae6f14ba095/manager/0.log" Sep 29 
10:45:36 crc kubenswrapper[4922]: I0929 10:45:36.134867 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-5s5wf_f4fbefa3-c5d4-4a51-b90a-512ebfcef863/manager/0.log" Sep 29 10:45:36 crc kubenswrapper[4922]: I0929 10:45:36.196506 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-lfh4h_97df5e99-5243-4552-ab72-7c6526deea11/kube-rbac-proxy/0.log" Sep 29 10:45:36 crc kubenswrapper[4922]: I0929 10:45:36.252728 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-lfh4h_97df5e99-5243-4552-ab72-7c6526deea11/manager/0.log" Sep 29 10:45:36 crc kubenswrapper[4922]: I0929 10:45:36.307676 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-lcd9t_0d5d466f-e41b-42c4-91ff-11e84d297b5d/kube-rbac-proxy/0.log" Sep 29 10:45:36 crc kubenswrapper[4922]: I0929 10:45:36.377928 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-lcd9t_0d5d466f-e41b-42c4-91ff-11e84d297b5d/manager/0.log" Sep 29 10:45:45 crc kubenswrapper[4922]: I0929 10:45:45.457796 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:45:45 crc kubenswrapper[4922]: E0929 10:45:45.458640 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:45:49 crc kubenswrapper[4922]: I0929 10:45:49.895166 
4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-cftkt_b2ea2f47-4732-47bd-9099-c503b5610f43/control-plane-machine-set-operator/0.log" Sep 29 10:45:50 crc kubenswrapper[4922]: I0929 10:45:50.054937 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dbc7l_dea0217e-c923-4045-9b4f-90a9eff30f93/machine-api-operator/0.log" Sep 29 10:45:50 crc kubenswrapper[4922]: I0929 10:45:50.097583 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dbc7l_dea0217e-c923-4045-9b4f-90a9eff30f93/kube-rbac-proxy/0.log" Sep 29 10:45:52 crc kubenswrapper[4922]: I0929 10:45:52.293212 4922 scope.go:117] "RemoveContainer" containerID="40f68fa2354de3ddc41781cf3181216cd74d2a28888e580f2819a08cf20cc97d" Sep 29 10:46:00 crc kubenswrapper[4922]: I0929 10:46:00.453176 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:46:00 crc kubenswrapper[4922]: E0929 10:46:00.454375 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:46:00 crc kubenswrapper[4922]: I0929 10:46:00.712913 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-snmmn_7626bb44-b67d-42eb-b912-a9b279f7157d/cert-manager-controller/0.log" Sep 29 10:46:00 crc kubenswrapper[4922]: I0929 10:46:00.876939 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-q9z92_c9cbdf04-ea9d-4663-94c9-345fb63f3f9c/cert-manager-cainjector/0.log" Sep 29 10:46:00 crc kubenswrapper[4922]: I0929 10:46:00.922536 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-x5g9r_93e3a2dc-f64b-4766-b851-2faa2e57c4f4/cert-manager-webhook/0.log" Sep 29 10:46:11 crc kubenswrapper[4922]: I0929 10:46:11.600474 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-gc2pq_bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72/nmstate-console-plugin/0.log" Sep 29 10:46:11 crc kubenswrapper[4922]: I0929 10:46:11.813711 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sp9q9_bf8e5bd0-e08e-4818-843b-30f7c956626f/nmstate-handler/0.log" Sep 29 10:46:11 crc kubenswrapper[4922]: I0929 10:46:11.839758 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-5cfzj_8fbc50c8-5afc-4ad5-888b-167e84fa22d0/kube-rbac-proxy/0.log" Sep 29 10:46:11 crc kubenswrapper[4922]: I0929 10:46:11.870842 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-5cfzj_8fbc50c8-5afc-4ad5-888b-167e84fa22d0/nmstate-metrics/0.log" Sep 29 10:46:12 crc kubenswrapper[4922]: I0929 10:46:12.033511 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-hrqwq_485426b6-cad6-4591-beaf-d8bb33f79ea1/nmstate-operator/0.log" Sep 29 10:46:12 crc kubenswrapper[4922]: I0929 10:46:12.112396 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-99cmf_430f697e-6b89-4db1-91a8-194c8a7af724/nmstate-webhook/0.log" Sep 29 10:46:13 crc kubenswrapper[4922]: I0929 10:46:13.452580 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" 
Sep 29 10:46:13 crc kubenswrapper[4922]: E0929 10:46:13.453135 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:46:25 crc kubenswrapper[4922]: I0929 10:46:25.743545 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-vz5n2_7732c258-d416-45bd-92a4-1a852c9bf4e6/kube-rbac-proxy/0.log" Sep 29 10:46:25 crc kubenswrapper[4922]: I0929 10:46:25.892682 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-vz5n2_7732c258-d416-45bd-92a4-1a852c9bf4e6/controller/0.log" Sep 29 10:46:25 crc kubenswrapper[4922]: I0929 10:46:25.958283 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-frr-files/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.199328 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-frr-files/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.235823 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-metrics/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.259850 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-reloader/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.275244 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-reloader/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.498521 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-frr-files/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.550330 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-metrics/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.553055 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-metrics/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.567979 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-reloader/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.741161 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-frr-files/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.775984 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-metrics/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.828478 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/controller/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.838619 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-reloader/0.log" Sep 29 10:46:26 crc kubenswrapper[4922]: I0929 10:46:26.991265 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/frr-metrics/0.log" Sep 29 10:46:27 crc kubenswrapper[4922]: I0929 10:46:27.050152 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/kube-rbac-proxy-frr/0.log" Sep 29 10:46:27 crc kubenswrapper[4922]: I0929 10:46:27.054504 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/kube-rbac-proxy/0.log" Sep 29 10:46:27 crc kubenswrapper[4922]: I0929 10:46:27.231075 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/reloader/0.log" Sep 29 10:46:27 crc kubenswrapper[4922]: I0929 10:46:27.231814 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-p9x6q_fe4bf7c5-f4cf-4c2c-9075-14c39b06297d/frr-k8s-webhook-server/0.log" Sep 29 10:46:27 crc kubenswrapper[4922]: I0929 10:46:27.476751 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6bfbf6b8fd-w44ps_74df0c3b-e3ed-4061-9fb2-a9a830974755/manager/0.log" Sep 29 10:46:27 crc kubenswrapper[4922]: I0929 10:46:27.644161 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-654f6f79d6-9gbmz_2d2222f4-496b-4cbf-883c-e3ac89e08a79/webhook-server/0.log" Sep 29 10:46:27 crc kubenswrapper[4922]: I0929 10:46:27.765320 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jp26n_3b15f008-2077-4246-af46-d39384412fa5/kube-rbac-proxy/0.log" Sep 29 10:46:28 crc kubenswrapper[4922]: I0929 10:46:28.452804 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:46:28 crc kubenswrapper[4922]: E0929 10:46:28.453069 4922 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:46:28 crc kubenswrapper[4922]: I0929 10:46:28.456676 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jp26n_3b15f008-2077-4246-af46-d39384412fa5/speaker/0.log" Sep 29 10:46:28 crc kubenswrapper[4922]: I0929 10:46:28.465624 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/frr/0.log" Sep 29 10:46:39 crc kubenswrapper[4922]: I0929 10:46:39.723436 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/util/0.log" Sep 29 10:46:39 crc kubenswrapper[4922]: I0929 10:46:39.871575 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/util/0.log" Sep 29 10:46:39 crc kubenswrapper[4922]: I0929 10:46:39.900061 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/pull/0.log" Sep 29 10:46:39 crc kubenswrapper[4922]: I0929 10:46:39.927529 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/pull/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.098373 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/pull/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.102194 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/util/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.109175 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/extract/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.281290 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-utilities/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.450101 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-content/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.459288 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-content/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.473775 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-utilities/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.671310 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-utilities/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.692640 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-content/0.log" Sep 29 10:46:40 crc kubenswrapper[4922]: I0929 10:46:40.930067 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-utilities/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.174459 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-content/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.186493 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-content/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.192030 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-utilities/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.208865 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/registry-server/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.356097 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-utilities/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.357258 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-content/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.561264 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/util/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.840120 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/pull/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.842154 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/pull/0.log" Sep 29 10:46:41 crc kubenswrapper[4922]: I0929 10:46:41.849923 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/util/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.032275 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/registry-server/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.140935 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/util/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.149970 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/extract/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.168991 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/pull/0.log" Sep 29 10:46:42 crc 
kubenswrapper[4922]: I0929 10:46:42.320966 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-utilities/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.324344 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fkw4t_69498aa6-9b16-42bd-97f7-f3f52b763788/marketplace-operator/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.521038 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-utilities/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.537430 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-content/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.559279 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-content/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.710316 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-utilities/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.717413 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-content/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.912118 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/registry-server/0.log" Sep 29 10:46:42 crc kubenswrapper[4922]: I0929 10:46:42.934599 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-utilities/0.log" Sep 29 10:46:43 crc kubenswrapper[4922]: I0929 10:46:43.097283 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-utilities/0.log" Sep 29 10:46:43 crc kubenswrapper[4922]: I0929 10:46:43.120906 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-content/0.log" Sep 29 10:46:43 crc kubenswrapper[4922]: I0929 10:46:43.129047 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-content/0.log" Sep 29 10:46:43 crc kubenswrapper[4922]: I0929 10:46:43.280809 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-utilities/0.log" Sep 29 10:46:43 crc kubenswrapper[4922]: I0929 10:46:43.325771 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-content/0.log" Sep 29 10:46:43 crc kubenswrapper[4922]: I0929 10:46:43.451501 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:46:43 crc kubenswrapper[4922]: E0929 10:46:43.451766 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:46:43 crc 
kubenswrapper[4922]: I0929 10:46:43.829608 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/registry-server/0.log" Sep 29 10:46:55 crc kubenswrapper[4922]: I0929 10:46:55.459628 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:46:55 crc kubenswrapper[4922]: E0929 10:46:55.460474 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:47:11 crc kubenswrapper[4922]: I0929 10:47:11.452129 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:47:11 crc kubenswrapper[4922]: E0929 10:47:11.453049 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:47:22 crc kubenswrapper[4922]: I0929 10:47:22.452997 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:47:22 crc kubenswrapper[4922]: E0929 10:47:22.454721 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:47:35 crc kubenswrapper[4922]: I0929 10:47:35.460293 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:47:35 crc kubenswrapper[4922]: E0929 10:47:35.467485 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:47:46 crc kubenswrapper[4922]: I0929 10:47:46.453181 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:47:46 crc kubenswrapper[4922]: E0929 10:47:46.454258 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.301079 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-647hb"] Sep 29 10:47:50 crc kubenswrapper[4922]: E0929 10:47:50.306049 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d154b1b-3915-4722-a99d-687ddf64ecf7" containerName="container-00" Sep 29 10:47:50 crc 
kubenswrapper[4922]: I0929 10:47:50.306357 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d154b1b-3915-4722-a99d-687ddf64ecf7" containerName="container-00" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.307100 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d154b1b-3915-4722-a99d-687ddf64ecf7" containerName="container-00" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.311503 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.330605 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-647hb"] Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.341735 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fztbn\" (UniqueName: \"kubernetes.io/projected/d93a28d4-980a-492b-9f00-a776bf900190-kube-api-access-fztbn\") pod \"community-operators-647hb\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.341859 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-utilities\") pod \"community-operators-647hb\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.341913 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-catalog-content\") pod \"community-operators-647hb\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 
crc kubenswrapper[4922]: I0929 10:47:50.444284 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-utilities\") pod \"community-operators-647hb\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.444739 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-catalog-content\") pod \"community-operators-647hb\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.444798 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-utilities\") pod \"community-operators-647hb\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.444954 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fztbn\" (UniqueName: \"kubernetes.io/projected/d93a28d4-980a-492b-9f00-a776bf900190-kube-api-access-fztbn\") pod \"community-operators-647hb\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.445283 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-catalog-content\") pod \"community-operators-647hb\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 
10:47:50.467024 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fztbn\" (UniqueName: \"kubernetes.io/projected/d93a28d4-980a-492b-9f00-a776bf900190-kube-api-access-fztbn\") pod \"community-operators-647hb\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:50 crc kubenswrapper[4922]: I0929 10:47:50.649472 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-647hb" Sep 29 10:47:51 crc kubenswrapper[4922]: W0929 10:47:51.170750 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd93a28d4_980a_492b_9f00_a776bf900190.slice/crio-ee6ed83a570fc1cd82c2f2a19e2add3a5e5afb595de26922ed13a051dd791007 WatchSource:0}: Error finding container ee6ed83a570fc1cd82c2f2a19e2add3a5e5afb595de26922ed13a051dd791007: Status 404 returned error can't find the container with id ee6ed83a570fc1cd82c2f2a19e2add3a5e5afb595de26922ed13a051dd791007 Sep 29 10:47:51 crc kubenswrapper[4922]: I0929 10:47:51.171974 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-647hb"] Sep 29 10:47:51 crc kubenswrapper[4922]: I0929 10:47:51.756730 4922 generic.go:334] "Generic (PLEG): container finished" podID="d93a28d4-980a-492b-9f00-a776bf900190" containerID="c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e" exitCode=0 Sep 29 10:47:51 crc kubenswrapper[4922]: I0929 10:47:51.756785 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-647hb" event={"ID":"d93a28d4-980a-492b-9f00-a776bf900190","Type":"ContainerDied","Data":"c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e"} Sep 29 10:47:51 crc kubenswrapper[4922]: I0929 10:47:51.756819 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-647hb" event={"ID":"d93a28d4-980a-492b-9f00-a776bf900190","Type":"ContainerStarted","Data":"ee6ed83a570fc1cd82c2f2a19e2add3a5e5afb595de26922ed13a051dd791007"} Sep 29 10:47:51 crc kubenswrapper[4922]: I0929 10:47:51.762191 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 10:47:52 crc kubenswrapper[4922]: I0929 10:47:52.767402 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-647hb" event={"ID":"d93a28d4-980a-492b-9f00-a776bf900190","Type":"ContainerStarted","Data":"db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc"} Sep 29 10:47:53 crc kubenswrapper[4922]: I0929 10:47:53.776892 4922 generic.go:334] "Generic (PLEG): container finished" podID="d93a28d4-980a-492b-9f00-a776bf900190" containerID="db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc" exitCode=0 Sep 29 10:47:53 crc kubenswrapper[4922]: I0929 10:47:53.776956 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-647hb" event={"ID":"d93a28d4-980a-492b-9f00-a776bf900190","Type":"ContainerDied","Data":"db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc"} Sep 29 10:47:54 crc kubenswrapper[4922]: I0929 10:47:54.788516 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-647hb" event={"ID":"d93a28d4-980a-492b-9f00-a776bf900190","Type":"ContainerStarted","Data":"2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5"} Sep 29 10:47:58 crc kubenswrapper[4922]: I0929 10:47:58.452381 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:47:58 crc kubenswrapper[4922]: E0929 10:47:58.453179 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:48:00 crc kubenswrapper[4922]: I0929 10:48:00.650014 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-647hb" Sep 29 10:48:00 crc kubenswrapper[4922]: I0929 10:48:00.650333 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-647hb" Sep 29 10:48:00 crc kubenswrapper[4922]: I0929 10:48:00.693555 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-647hb" Sep 29 10:48:00 crc kubenswrapper[4922]: I0929 10:48:00.711179 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-647hb" podStartSLOduration=8.239153126 podStartE2EDuration="10.711157845s" podCreationTimestamp="2025-09-29 10:47:50 +0000 UTC" firstStartedPulling="2025-09-29 10:47:51.761480488 +0000 UTC m=+3797.127710802" lastFinishedPulling="2025-09-29 10:47:54.233485257 +0000 UTC m=+3799.599715521" observedRunningTime="2025-09-29 10:47:54.815258442 +0000 UTC m=+3800.181488706" watchObservedRunningTime="2025-09-29 10:48:00.711157845 +0000 UTC m=+3806.077388109" Sep 29 10:48:00 crc kubenswrapper[4922]: I0929 10:48:00.896329 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-647hb" Sep 29 10:48:00 crc kubenswrapper[4922]: I0929 10:48:00.940080 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-647hb"] Sep 29 10:48:02 crc kubenswrapper[4922]: I0929 10:48:02.858891 4922 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-647hb" podUID="d93a28d4-980a-492b-9f00-a776bf900190" containerName="registry-server" containerID="cri-o://2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5" gracePeriod=2 Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.316446 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-647hb" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.499770 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-catalog-content\") pod \"d93a28d4-980a-492b-9f00-a776bf900190\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.500123 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-utilities\") pod \"d93a28d4-980a-492b-9f00-a776bf900190\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.500326 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fztbn\" (UniqueName: \"kubernetes.io/projected/d93a28d4-980a-492b-9f00-a776bf900190-kube-api-access-fztbn\") pod \"d93a28d4-980a-492b-9f00-a776bf900190\" (UID: \"d93a28d4-980a-492b-9f00-a776bf900190\") " Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.501876 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-utilities" (OuterVolumeSpecName: "utilities") pod "d93a28d4-980a-492b-9f00-a776bf900190" (UID: "d93a28d4-980a-492b-9f00-a776bf900190"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.510003 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93a28d4-980a-492b-9f00-a776bf900190-kube-api-access-fztbn" (OuterVolumeSpecName: "kube-api-access-fztbn") pod "d93a28d4-980a-492b-9f00-a776bf900190" (UID: "d93a28d4-980a-492b-9f00-a776bf900190"). InnerVolumeSpecName "kube-api-access-fztbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.544363 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d93a28d4-980a-492b-9f00-a776bf900190" (UID: "d93a28d4-980a-492b-9f00-a776bf900190"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.602943 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.602991 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fztbn\" (UniqueName: \"kubernetes.io/projected/d93a28d4-980a-492b-9f00-a776bf900190-kube-api-access-fztbn\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.603006 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93a28d4-980a-492b-9f00-a776bf900190-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.871551 4922 generic.go:334] "Generic (PLEG): container finished" podID="d93a28d4-980a-492b-9f00-a776bf900190" 
containerID="2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5" exitCode=0 Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.871603 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-647hb" event={"ID":"d93a28d4-980a-492b-9f00-a776bf900190","Type":"ContainerDied","Data":"2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5"} Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.871625 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-647hb" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.871647 4922 scope.go:117] "RemoveContainer" containerID="2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.871635 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-647hb" event={"ID":"d93a28d4-980a-492b-9f00-a776bf900190","Type":"ContainerDied","Data":"ee6ed83a570fc1cd82c2f2a19e2add3a5e5afb595de26922ed13a051dd791007"} Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.892442 4922 scope.go:117] "RemoveContainer" containerID="db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.915952 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-647hb"] Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.924464 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-647hb"] Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.934260 4922 scope.go:117] "RemoveContainer" containerID="c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.961048 4922 scope.go:117] "RemoveContainer" containerID="2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5" Sep 29 
10:48:03 crc kubenswrapper[4922]: E0929 10:48:03.961558 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5\": container with ID starting with 2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5 not found: ID does not exist" containerID="2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.961594 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5"} err="failed to get container status \"2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5\": rpc error: code = NotFound desc = could not find container \"2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5\": container with ID starting with 2e8ec107ac8e0b2a181d6d2fb53dcef39f1c4266bd9df3b1dd5523f3d87375b5 not found: ID does not exist" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.961618 4922 scope.go:117] "RemoveContainer" containerID="db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc" Sep 29 10:48:03 crc kubenswrapper[4922]: E0929 10:48:03.962019 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc\": container with ID starting with db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc not found: ID does not exist" containerID="db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.962066 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc"} err="failed to get container status 
\"db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc\": rpc error: code = NotFound desc = could not find container \"db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc\": container with ID starting with db2bddac4c8eb40f287b2be82810839b132d39cc29a9b846d5e1ce0cd08d50cc not found: ID does not exist" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.962100 4922 scope.go:117] "RemoveContainer" containerID="c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e" Sep 29 10:48:03 crc kubenswrapper[4922]: E0929 10:48:03.962612 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e\": container with ID starting with c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e not found: ID does not exist" containerID="c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e" Sep 29 10:48:03 crc kubenswrapper[4922]: I0929 10:48:03.962655 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e"} err="failed to get container status \"c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e\": rpc error: code = NotFound desc = could not find container \"c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e\": container with ID starting with c76bedff5d25677e9de5e0c4b8fbed00ea0fe364b218159ea7d2d0d9376b4a9e not found: ID does not exist" Sep 29 10:48:05 crc kubenswrapper[4922]: I0929 10:48:05.463585 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d93a28d4-980a-492b-9f00-a776bf900190" path="/var/lib/kubelet/pods/d93a28d4-980a-492b-9f00-a776bf900190/volumes" Sep 29 10:48:09 crc kubenswrapper[4922]: I0929 10:48:09.453113 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 
10:48:09 crc kubenswrapper[4922]: E0929 10:48:09.454081 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:48:23 crc kubenswrapper[4922]: I0929 10:48:23.454507 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:48:23 crc kubenswrapper[4922]: E0929 10:48:23.455326 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:48:38 crc kubenswrapper[4922]: I0929 10:48:38.452675 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:48:38 crc kubenswrapper[4922]: E0929 10:48:38.453765 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:48:44 crc kubenswrapper[4922]: I0929 10:48:44.269214 4922 generic.go:334] "Generic (PLEG): container finished" podID="2a4eb407-0186-4d8e-8df5-e301333ff978" 
containerID="651b84d4d0846c5ca6f5535eada2c3a66f380d007edbb0f9fb82fdcfb762da52" exitCode=0 Sep 29 10:48:44 crc kubenswrapper[4922]: I0929 10:48:44.269301 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l9464/must-gather-rprqs" event={"ID":"2a4eb407-0186-4d8e-8df5-e301333ff978","Type":"ContainerDied","Data":"651b84d4d0846c5ca6f5535eada2c3a66f380d007edbb0f9fb82fdcfb762da52"} Sep 29 10:48:44 crc kubenswrapper[4922]: I0929 10:48:44.270474 4922 scope.go:117] "RemoveContainer" containerID="651b84d4d0846c5ca6f5535eada2c3a66f380d007edbb0f9fb82fdcfb762da52" Sep 29 10:48:45 crc kubenswrapper[4922]: I0929 10:48:45.176027 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l9464_must-gather-rprqs_2a4eb407-0186-4d8e-8df5-e301333ff978/gather/0.log" Sep 29 10:48:50 crc kubenswrapper[4922]: I0929 10:48:50.452496 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:48:50 crc kubenswrapper[4922]: E0929 10:48:50.453407 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.225200 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l9464/must-gather-rprqs"] Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.225807 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-l9464/must-gather-rprqs" podUID="2a4eb407-0186-4d8e-8df5-e301333ff978" containerName="copy" containerID="cri-o://73487934a7c4e7c71344da42daed43b1ef4b431438c9bab85dac3a30c4e858a3" 
gracePeriod=2 Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.235606 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l9464/must-gather-rprqs"] Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.354200 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l9464_must-gather-rprqs_2a4eb407-0186-4d8e-8df5-e301333ff978/copy/0.log" Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.354826 4922 generic.go:334] "Generic (PLEG): container finished" podID="2a4eb407-0186-4d8e-8df5-e301333ff978" containerID="73487934a7c4e7c71344da42daed43b1ef4b431438c9bab85dac3a30c4e858a3" exitCode=143 Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.757804 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l9464_must-gather-rprqs_2a4eb407-0186-4d8e-8df5-e301333ff978/copy/0.log" Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.758389 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.866009 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdsrn\" (UniqueName: \"kubernetes.io/projected/2a4eb407-0186-4d8e-8df5-e301333ff978-kube-api-access-kdsrn\") pod \"2a4eb407-0186-4d8e-8df5-e301333ff978\" (UID: \"2a4eb407-0186-4d8e-8df5-e301333ff978\") " Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.866165 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a4eb407-0186-4d8e-8df5-e301333ff978-must-gather-output\") pod \"2a4eb407-0186-4d8e-8df5-e301333ff978\" (UID: \"2a4eb407-0186-4d8e-8df5-e301333ff978\") " Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.885754 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a4eb407-0186-4d8e-8df5-e301333ff978-kube-api-access-kdsrn" (OuterVolumeSpecName: "kube-api-access-kdsrn") pod "2a4eb407-0186-4d8e-8df5-e301333ff978" (UID: "2a4eb407-0186-4d8e-8df5-e301333ff978"). InnerVolumeSpecName "kube-api-access-kdsrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:48:53 crc kubenswrapper[4922]: I0929 10:48:53.968254 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdsrn\" (UniqueName: \"kubernetes.io/projected/2a4eb407-0186-4d8e-8df5-e301333ff978-kube-api-access-kdsrn\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:54 crc kubenswrapper[4922]: I0929 10:48:54.035358 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a4eb407-0186-4d8e-8df5-e301333ff978-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2a4eb407-0186-4d8e-8df5-e301333ff978" (UID: "2a4eb407-0186-4d8e-8df5-e301333ff978"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:48:54 crc kubenswrapper[4922]: I0929 10:48:54.070422 4922 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a4eb407-0186-4d8e-8df5-e301333ff978-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 29 10:48:54 crc kubenswrapper[4922]: I0929 10:48:54.363999 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l9464_must-gather-rprqs_2a4eb407-0186-4d8e-8df5-e301333ff978/copy/0.log" Sep 29 10:48:54 crc kubenswrapper[4922]: I0929 10:48:54.365432 4922 scope.go:117] "RemoveContainer" containerID="73487934a7c4e7c71344da42daed43b1ef4b431438c9bab85dac3a30c4e858a3" Sep 29 10:48:54 crc kubenswrapper[4922]: I0929 10:48:54.365480 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l9464/must-gather-rprqs" Sep 29 10:48:54 crc kubenswrapper[4922]: I0929 10:48:54.384353 4922 scope.go:117] "RemoveContainer" containerID="651b84d4d0846c5ca6f5535eada2c3a66f380d007edbb0f9fb82fdcfb762da52" Sep 29 10:48:55 crc kubenswrapper[4922]: I0929 10:48:55.461433 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a4eb407-0186-4d8e-8df5-e301333ff978" path="/var/lib/kubelet/pods/2a4eb407-0186-4d8e-8df5-e301333ff978/volumes" Sep 29 10:49:05 crc kubenswrapper[4922]: I0929 10:49:05.460217 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:49:05 crc kubenswrapper[4922]: E0929 10:49:05.460768 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" 
podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:49:17 crc kubenswrapper[4922]: I0929 10:49:17.452243 4922 scope.go:117] "RemoveContainer" containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:49:17 crc kubenswrapper[4922]: E0929 10:49:17.453490 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.163320 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zrncb/must-gather-n9z8d"] Sep 29 10:49:22 crc kubenswrapper[4922]: E0929 10:49:22.164473 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4eb407-0186-4d8e-8df5-e301333ff978" containerName="copy" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.164491 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4eb407-0186-4d8e-8df5-e301333ff978" containerName="copy" Sep 29 10:49:22 crc kubenswrapper[4922]: E0929 10:49:22.164511 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93a28d4-980a-492b-9f00-a776bf900190" containerName="registry-server" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.164518 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93a28d4-980a-492b-9f00-a776bf900190" containerName="registry-server" Sep 29 10:49:22 crc kubenswrapper[4922]: E0929 10:49:22.164543 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93a28d4-980a-492b-9f00-a776bf900190" containerName="extract-content" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.164556 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d93a28d4-980a-492b-9f00-a776bf900190" containerName="extract-content" Sep 29 10:49:22 crc kubenswrapper[4922]: E0929 10:49:22.164580 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93a28d4-980a-492b-9f00-a776bf900190" containerName="extract-utilities" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.164588 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93a28d4-980a-492b-9f00-a776bf900190" containerName="extract-utilities" Sep 29 10:49:22 crc kubenswrapper[4922]: E0929 10:49:22.164610 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4eb407-0186-4d8e-8df5-e301333ff978" containerName="gather" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.164617 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4eb407-0186-4d8e-8df5-e301333ff978" containerName="gather" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.164870 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4eb407-0186-4d8e-8df5-e301333ff978" containerName="gather" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.164898 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4eb407-0186-4d8e-8df5-e301333ff978" containerName="copy" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.164908 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d93a28d4-980a-492b-9f00-a776bf900190" containerName="registry-server" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.166169 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrncb/must-gather-n9z8d" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.169358 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zrncb"/"openshift-service-ca.crt" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.173952 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zrncb"/"default-dockercfg-7t8pr" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.174105 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zrncb"/"kube-root-ca.crt" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.175723 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zrncb/must-gather-n9z8d"] Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.212541 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/079038bc-a923-488e-9c8c-a2729ab2c150-must-gather-output\") pod \"must-gather-n9z8d\" (UID: \"079038bc-a923-488e-9c8c-a2729ab2c150\") " pod="openshift-must-gather-zrncb/must-gather-n9z8d" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.212922 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hgh2\" (UniqueName: \"kubernetes.io/projected/079038bc-a923-488e-9c8c-a2729ab2c150-kube-api-access-7hgh2\") pod \"must-gather-n9z8d\" (UID: \"079038bc-a923-488e-9c8c-a2729ab2c150\") " pod="openshift-must-gather-zrncb/must-gather-n9z8d" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.314660 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hgh2\" (UniqueName: \"kubernetes.io/projected/079038bc-a923-488e-9c8c-a2729ab2c150-kube-api-access-7hgh2\") pod \"must-gather-n9z8d\" (UID: \"079038bc-a923-488e-9c8c-a2729ab2c150\") " 
pod="openshift-must-gather-zrncb/must-gather-n9z8d" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.314806 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/079038bc-a923-488e-9c8c-a2729ab2c150-must-gather-output\") pod \"must-gather-n9z8d\" (UID: \"079038bc-a923-488e-9c8c-a2729ab2c150\") " pod="openshift-must-gather-zrncb/must-gather-n9z8d" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.315284 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/079038bc-a923-488e-9c8c-a2729ab2c150-must-gather-output\") pod \"must-gather-n9z8d\" (UID: \"079038bc-a923-488e-9c8c-a2729ab2c150\") " pod="openshift-must-gather-zrncb/must-gather-n9z8d" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.343787 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hgh2\" (UniqueName: \"kubernetes.io/projected/079038bc-a923-488e-9c8c-a2729ab2c150-kube-api-access-7hgh2\") pod \"must-gather-n9z8d\" (UID: \"079038bc-a923-488e-9c8c-a2729ab2c150\") " pod="openshift-must-gather-zrncb/must-gather-n9z8d" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.488586 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrncb/must-gather-n9z8d" Sep 29 10:49:22 crc kubenswrapper[4922]: I0929 10:49:22.927511 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zrncb/must-gather-n9z8d"] Sep 29 10:49:23 crc kubenswrapper[4922]: I0929 10:49:23.619906 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/must-gather-n9z8d" event={"ID":"079038bc-a923-488e-9c8c-a2729ab2c150","Type":"ContainerStarted","Data":"8d28d51cf00ce452830db9e119bd1771724c2732550982821a57fcf993b02e51"} Sep 29 10:49:23 crc kubenswrapper[4922]: I0929 10:49:23.620247 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/must-gather-n9z8d" event={"ID":"079038bc-a923-488e-9c8c-a2729ab2c150","Type":"ContainerStarted","Data":"8555344e13e12465d043e8f273f97cba4588017c6e374a1b1cee2e75a167959a"} Sep 29 10:49:23 crc kubenswrapper[4922]: I0929 10:49:23.620261 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/must-gather-n9z8d" event={"ID":"079038bc-a923-488e-9c8c-a2729ab2c150","Type":"ContainerStarted","Data":"abc22e3d13152db2f61e77c485b3f8f36375f8f83b592812f263a3604f6c724e"} Sep 29 10:49:23 crc kubenswrapper[4922]: I0929 10:49:23.637754 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zrncb/must-gather-n9z8d" podStartSLOduration=1.637730902 podStartE2EDuration="1.637730902s" podCreationTimestamp="2025-09-29 10:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:49:23.634185854 +0000 UTC m=+3889.000416128" watchObservedRunningTime="2025-09-29 10:49:23.637730902 +0000 UTC m=+3889.003961166" Sep 29 10:49:26 crc kubenswrapper[4922]: I0929 10:49:26.450116 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zrncb/crc-debug-589gh"] Sep 29 10:49:26 crc kubenswrapper[4922]: 
I0929 10:49:26.451722 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:49:26 crc kubenswrapper[4922]: I0929 10:49:26.494289 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc909a37-c2d0-4885-8811-3661f036b134-host\") pod \"crc-debug-589gh\" (UID: \"dc909a37-c2d0-4885-8811-3661f036b134\") " pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:49:26 crc kubenswrapper[4922]: I0929 10:49:26.494351 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85kl9\" (UniqueName: \"kubernetes.io/projected/dc909a37-c2d0-4885-8811-3661f036b134-kube-api-access-85kl9\") pod \"crc-debug-589gh\" (UID: \"dc909a37-c2d0-4885-8811-3661f036b134\") " pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:49:26 crc kubenswrapper[4922]: I0929 10:49:26.596146 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc909a37-c2d0-4885-8811-3661f036b134-host\") pod \"crc-debug-589gh\" (UID: \"dc909a37-c2d0-4885-8811-3661f036b134\") " pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:49:26 crc kubenswrapper[4922]: I0929 10:49:26.596207 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85kl9\" (UniqueName: \"kubernetes.io/projected/dc909a37-c2d0-4885-8811-3661f036b134-kube-api-access-85kl9\") pod \"crc-debug-589gh\" (UID: \"dc909a37-c2d0-4885-8811-3661f036b134\") " pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:49:26 crc kubenswrapper[4922]: I0929 10:49:26.596311 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc909a37-c2d0-4885-8811-3661f036b134-host\") pod \"crc-debug-589gh\" (UID: \"dc909a37-c2d0-4885-8811-3661f036b134\") 
" pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:49:26 crc kubenswrapper[4922]: I0929 10:49:26.617149 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85kl9\" (UniqueName: \"kubernetes.io/projected/dc909a37-c2d0-4885-8811-3661f036b134-kube-api-access-85kl9\") pod \"crc-debug-589gh\" (UID: \"dc909a37-c2d0-4885-8811-3661f036b134\") " pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:49:26 crc kubenswrapper[4922]: I0929 10:49:26.776559 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:49:27 crc kubenswrapper[4922]: I0929 10:49:27.656381 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/crc-debug-589gh" event={"ID":"dc909a37-c2d0-4885-8811-3661f036b134","Type":"ContainerStarted","Data":"e707507c3e1a01a7fb064a69947360ce4fbd48538f9df2e34588931d8c02b4f9"} Sep 29 10:49:27 crc kubenswrapper[4922]: I0929 10:49:27.657036 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/crc-debug-589gh" event={"ID":"dc909a37-c2d0-4885-8811-3661f036b134","Type":"ContainerStarted","Data":"cd53753343b7c07b530dd112381de357d9920bce523cd1df018ff215d0fb9d48"} Sep 29 10:49:27 crc kubenswrapper[4922]: I0929 10:49:27.673414 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zrncb/crc-debug-589gh" podStartSLOduration=1.673393371 podStartE2EDuration="1.673393371s" podCreationTimestamp="2025-09-29 10:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:49:27.668747757 +0000 UTC m=+3893.034978041" watchObservedRunningTime="2025-09-29 10:49:27.673393371 +0000 UTC m=+3893.039623635" Sep 29 10:49:29 crc kubenswrapper[4922]: I0929 10:49:29.452715 4922 scope.go:117] "RemoveContainer" 
containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401" Sep 29 10:49:30 crc kubenswrapper[4922]: I0929 10:49:30.684089 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"7c213d1dc05d9221574da4502f4396900113bbadf8dec97284b7ec5120964a91"} Sep 29 10:49:52 crc kubenswrapper[4922]: I0929 10:49:52.422659 4922 scope.go:117] "RemoveContainer" containerID="955dfc09f81346f263009decf4b3fbd3f4fde4a78c4e559d620a34cb8351f457" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.624004 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zmkx9"] Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.627860 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.639384 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmkx9"] Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.654322 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-catalog-content\") pod \"redhat-marketplace-zmkx9\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.654670 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-utilities\") pod \"redhat-marketplace-zmkx9\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 
10:50:17.655345 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgrr\" (UniqueName: \"kubernetes.io/projected/cf04013f-1577-4f1b-a8ad-cf2a0151c317-kube-api-access-csgrr\") pod \"redhat-marketplace-zmkx9\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.758380 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-utilities\") pod \"redhat-marketplace-zmkx9\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.758524 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgrr\" (UniqueName: \"kubernetes.io/projected/cf04013f-1577-4f1b-a8ad-cf2a0151c317-kube-api-access-csgrr\") pod \"redhat-marketplace-zmkx9\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.758700 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-catalog-content\") pod \"redhat-marketplace-zmkx9\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.759786 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-utilities\") pod \"redhat-marketplace-zmkx9\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 
10:50:17.760466 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-catalog-content\") pod \"redhat-marketplace-zmkx9\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.796413 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgrr\" (UniqueName: \"kubernetes.io/projected/cf04013f-1577-4f1b-a8ad-cf2a0151c317-kube-api-access-csgrr\") pod \"redhat-marketplace-zmkx9\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:17 crc kubenswrapper[4922]: I0929 10:50:17.947477 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:18 crc kubenswrapper[4922]: I0929 10:50:18.483606 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmkx9"] Sep 29 10:50:19 crc kubenswrapper[4922]: I0929 10:50:19.169084 4922 generic.go:334] "Generic (PLEG): container finished" podID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerID="ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf" exitCode=0 Sep 29 10:50:19 crc kubenswrapper[4922]: I0929 10:50:19.169172 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmkx9" event={"ID":"cf04013f-1577-4f1b-a8ad-cf2a0151c317","Type":"ContainerDied","Data":"ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf"} Sep 29 10:50:19 crc kubenswrapper[4922]: I0929 10:50:19.169627 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmkx9" 
event={"ID":"cf04013f-1577-4f1b-a8ad-cf2a0151c317","Type":"ContainerStarted","Data":"0f897a288236177464dd3430f4a71f56a74524de27ca678538cf8b0a5167f525"} Sep 29 10:50:20 crc kubenswrapper[4922]: I0929 10:50:20.187936 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmkx9" event={"ID":"cf04013f-1577-4f1b-a8ad-cf2a0151c317","Type":"ContainerStarted","Data":"090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef"} Sep 29 10:50:21 crc kubenswrapper[4922]: I0929 10:50:21.219295 4922 generic.go:334] "Generic (PLEG): container finished" podID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerID="090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef" exitCode=0 Sep 29 10:50:21 crc kubenswrapper[4922]: I0929 10:50:21.219396 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmkx9" event={"ID":"cf04013f-1577-4f1b-a8ad-cf2a0151c317","Type":"ContainerDied","Data":"090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef"} Sep 29 10:50:22 crc kubenswrapper[4922]: I0929 10:50:22.230371 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmkx9" event={"ID":"cf04013f-1577-4f1b-a8ad-cf2a0151c317","Type":"ContainerStarted","Data":"ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f"} Sep 29 10:50:22 crc kubenswrapper[4922]: I0929 10:50:22.254912 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zmkx9" podStartSLOduration=2.630039369 podStartE2EDuration="5.254893009s" podCreationTimestamp="2025-09-29 10:50:17 +0000 UTC" firstStartedPulling="2025-09-29 10:50:19.170804192 +0000 UTC m=+3944.537034446" lastFinishedPulling="2025-09-29 10:50:21.795657822 +0000 UTC m=+3947.161888086" observedRunningTime="2025-09-29 10:50:22.250126971 +0000 UTC m=+3947.616357245" watchObservedRunningTime="2025-09-29 10:50:22.254893009 +0000 UTC 
m=+3947.621123273" Sep 29 10:50:27 crc kubenswrapper[4922]: I0929 10:50:27.948451 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:27 crc kubenswrapper[4922]: I0929 10:50:27.950022 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:28 crc kubenswrapper[4922]: I0929 10:50:27.999983 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:28 crc kubenswrapper[4922]: I0929 10:50:28.336165 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:28 crc kubenswrapper[4922]: I0929 10:50:28.390810 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmkx9"] Sep 29 10:50:28 crc kubenswrapper[4922]: I0929 10:50:28.452207 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9c9d67f9d-vs7st_c3007912-e64d-4325-beb8-fa3c2dfcbe5e/barbican-api/0.log" Sep 29 10:50:28 crc kubenswrapper[4922]: I0929 10:50:28.468847 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9c9d67f9d-vs7st_c3007912-e64d-4325-beb8-fa3c2dfcbe5e/barbican-api-log/0.log" Sep 29 10:50:28 crc kubenswrapper[4922]: I0929 10:50:28.679010 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cdd7877d-rvfhb_541f048f-4db6-45d6-aaa2-659dc9ff0b86/barbican-keystone-listener/0.log" Sep 29 10:50:28 crc kubenswrapper[4922]: I0929 10:50:28.709116 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cdd7877d-rvfhb_541f048f-4db6-45d6-aaa2-659dc9ff0b86/barbican-keystone-listener-log/0.log" Sep 29 10:50:28 crc kubenswrapper[4922]: I0929 10:50:28.879952 4922 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_barbican-worker-59bb77879f-bdcc9_f0d2cc2a-cdf2-490c-a56b-48977a5d83e0/barbican-worker/0.log" Sep 29 10:50:28 crc kubenswrapper[4922]: I0929 10:50:28.926138 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-59bb77879f-bdcc9_f0d2cc2a-cdf2-490c-a56b-48977a5d83e0/barbican-worker-log/0.log" Sep 29 10:50:29 crc kubenswrapper[4922]: I0929 10:50:29.114742 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cx72z_9c5d1232-a030-44f4-823e-5c806d5dd896/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:29 crc kubenswrapper[4922]: I0929 10:50:29.288656 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ede8dfdb-116f-4e02-8408-aea659020067/ceilometer-central-agent/0.log" Sep 29 10:50:29 crc kubenswrapper[4922]: I0929 10:50:29.314067 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ede8dfdb-116f-4e02-8408-aea659020067/ceilometer-notification-agent/0.log" Sep 29 10:50:29 crc kubenswrapper[4922]: I0929 10:50:29.358285 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ede8dfdb-116f-4e02-8408-aea659020067/proxy-httpd/0.log" Sep 29 10:50:29 crc kubenswrapper[4922]: I0929 10:50:29.628000 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ede8dfdb-116f-4e02-8408-aea659020067/sg-core/0.log" Sep 29 10:50:29 crc kubenswrapper[4922]: I0929 10:50:29.746106 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_34ed6be9-1694-4866-a437-36f08027b85f/cinder-api/0.log" Sep 29 10:50:29 crc kubenswrapper[4922]: I0929 10:50:29.816676 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_34ed6be9-1694-4866-a437-36f08027b85f/cinder-api-log/0.log" Sep 29 10:50:29 crc kubenswrapper[4922]: I0929 10:50:29.999309 4922 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9172c63-e9e9-43c9-a804-72410f85eefe/cinder-scheduler/0.log" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.063751 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9172c63-e9e9-43c9-a804-72410f85eefe/probe/0.log" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.218736 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6tvr9_b7891137-eddd-4865-9a35-f32a72a1f206/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.306387 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zmkx9" podUID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerName="registry-server" containerID="cri-o://ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f" gracePeriod=2 Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.375339 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-nc6vc_8653f711-4f91-4ce3-a900-95aa54ac26a1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.533684 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6l2dp_fd5275e6-c3d3-474d-962a-3cdafc893dfd/init/0.log" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.816702 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.831386 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6l2dp_fd5275e6-c3d3-474d-962a-3cdafc893dfd/init/0.log" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.898911 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6l2dp_fd5275e6-c3d3-474d-962a-3cdafc893dfd/dnsmasq-dns/0.log" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.927507 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-utilities\") pod \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.927793 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-catalog-content\") pod \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.927902 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csgrr\" (UniqueName: \"kubernetes.io/projected/cf04013f-1577-4f1b-a8ad-cf2a0151c317-kube-api-access-csgrr\") pod \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\" (UID: \"cf04013f-1577-4f1b-a8ad-cf2a0151c317\") " Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.928490 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-utilities" (OuterVolumeSpecName: "utilities") pod "cf04013f-1577-4f1b-a8ad-cf2a0151c317" (UID: "cf04013f-1577-4f1b-a8ad-cf2a0151c317"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.941322 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf04013f-1577-4f1b-a8ad-cf2a0151c317" (UID: "cf04013f-1577-4f1b-a8ad-cf2a0151c317"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:50:30 crc kubenswrapper[4922]: I0929 10:50:30.952093 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf04013f-1577-4f1b-a8ad-cf2a0151c317-kube-api-access-csgrr" (OuterVolumeSpecName: "kube-api-access-csgrr") pod "cf04013f-1577-4f1b-a8ad-cf2a0151c317" (UID: "cf04013f-1577-4f1b-a8ad-cf2a0151c317"). InnerVolumeSpecName "kube-api-access-csgrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.030623 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.030672 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csgrr\" (UniqueName: \"kubernetes.io/projected/cf04013f-1577-4f1b-a8ad-cf2a0151c317-kube-api-access-csgrr\") on node \"crc\" DevicePath \"\"" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.030686 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf04013f-1577-4f1b-a8ad-cf2a0151c317-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.058520 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-gngd5_3cbc70f7-2707-430a-a8d1-d33aee8c7ae8/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.187363 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4da22caf-781b-42ef-ad66-521d0908aabb/glance-httpd/0.log" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.264298 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4da22caf-781b-42ef-ad66-521d0908aabb/glance-log/0.log" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.317701 4922 generic.go:334] "Generic (PLEG): container finished" podID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerID="ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f" exitCode=0 Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.317739 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmkx9" event={"ID":"cf04013f-1577-4f1b-a8ad-cf2a0151c317","Type":"ContainerDied","Data":"ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f"} Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.317765 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmkx9" event={"ID":"cf04013f-1577-4f1b-a8ad-cf2a0151c317","Type":"ContainerDied","Data":"0f897a288236177464dd3430f4a71f56a74524de27ca678538cf8b0a5167f525"} Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.317773 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmkx9" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.317782 4922 scope.go:117] "RemoveContainer" containerID="ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.340389 4922 scope.go:117] "RemoveContainer" containerID="090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.362740 4922 scope.go:117] "RemoveContainer" containerID="ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.363729 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmkx9"] Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.383910 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmkx9"] Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.404600 4922 scope.go:117] "RemoveContainer" containerID="ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f" Sep 29 10:50:31 crc kubenswrapper[4922]: E0929 10:50:31.406070 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f\": container with ID starting with ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f not found: ID does not exist" containerID="ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.406122 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f"} err="failed to get container status \"ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f\": rpc error: code = NotFound desc = could not find container 
\"ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f\": container with ID starting with ea6e4785c4d0e97f8e8b8d6ecde5edab586d6a3b11637979801fe0f672bddd5f not found: ID does not exist" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.406150 4922 scope.go:117] "RemoveContainer" containerID="090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef" Sep 29 10:50:31 crc kubenswrapper[4922]: E0929 10:50:31.406549 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef\": container with ID starting with 090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef not found: ID does not exist" containerID="090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.406582 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef"} err="failed to get container status \"090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef\": rpc error: code = NotFound desc = could not find container \"090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef\": container with ID starting with 090f696c0beaaffb3e6f06c72c780dddf228c3cf439b0a3904b7cc26e5b799ef not found: ID does not exist" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.406623 4922 scope.go:117] "RemoveContainer" containerID="ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf" Sep 29 10:50:31 crc kubenswrapper[4922]: E0929 10:50:31.406963 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf\": container with ID starting with ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf not found: ID does not exist" 
containerID="ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.407018 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf"} err="failed to get container status \"ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf\": rpc error: code = NotFound desc = could not find container \"ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf\": container with ID starting with ac9560ec9371c182fb3811be800570772ba68dfd1de09d1f4f56de17b407c1bf not found: ID does not exist" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.448779 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5ec5073f-9a07-4292-8cfb-62e419a0438d/glance-httpd/0.log" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.461518 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" path="/var/lib/kubelet/pods/cf04013f-1577-4f1b-a8ad-cf2a0151c317/volumes" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.491725 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5ec5073f-9a07-4292-8cfb-62e419a0438d/glance-log/0.log" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.690335 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58b957f588-sp2bt_84f21d67-d595-4458-871c-e4bbc362b134/horizon/0.log" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.962825 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-5sq24_d69265aa-752a-4d25-9af4-6dd389d13e8a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:31 crc kubenswrapper[4922]: I0929 10:50:31.998216 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7wdvn_425016bd-6178-497e-ad2b-e150d1cf141f/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:32 crc kubenswrapper[4922]: I0929 10:50:32.059105 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58b957f588-sp2bt_84f21d67-d595-4458-871c-e4bbc362b134/horizon-log/0.log" Sep 29 10:50:32 crc kubenswrapper[4922]: I0929 10:50:32.246505 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8000729b-19d9-47cd-baa5-7ee4bab9cc04/kube-state-metrics/0.log" Sep 29 10:50:32 crc kubenswrapper[4922]: I0929 10:50:32.333210 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-79f578d789-bbw9r_858b0ba1-ffa7-46f1-a8fa-a404c1cdcf99/keystone-api/0.log" Sep 29 10:50:32 crc kubenswrapper[4922]: I0929 10:50:32.486330 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vvllg_bcdc9bf2-2da5-4261-89b5-dd6111d25d3b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:32 crc kubenswrapper[4922]: I0929 10:50:32.782134 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8b5fcf5f9-p74mm_59b8f377-8449-49f6-992b-6b76ef613283/neutron-httpd/0.log" Sep 29 10:50:32 crc kubenswrapper[4922]: I0929 10:50:32.885635 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8b5fcf5f9-p74mm_59b8f377-8449-49f6-992b-6b76ef613283/neutron-api/0.log" Sep 29 10:50:33 crc kubenswrapper[4922]: I0929 10:50:33.058983 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-99gff_f2ca0ea5-d1da-4a19-8058-4ea277fcb4d7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:33 crc kubenswrapper[4922]: I0929 10:50:33.923874 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_18e1ff02-c0b1-4095-a40d-b9dc5a492de4/nova-api-log/0.log" Sep 29 10:50:34 crc kubenswrapper[4922]: I0929 10:50:34.119052 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e1139d93-2038-4fa3-b31c-1e7ddedd0bb7/nova-cell0-conductor-conductor/0.log" Sep 29 10:50:34 crc kubenswrapper[4922]: I0929 10:50:34.299701 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_18e1ff02-c0b1-4095-a40d-b9dc5a492de4/nova-api-api/0.log" Sep 29 10:50:34 crc kubenswrapper[4922]: I0929 10:50:34.563555 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4d6968da-188d-461e-ab0f-00bf3e2fab0c/nova-cell1-conductor-conductor/0.log" Sep 29 10:50:34 crc kubenswrapper[4922]: I0929 10:50:34.686013 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c69f22d7-5c9b-4f45-bb81-1a5ff8ac8528/nova-cell1-novncproxy-novncproxy/0.log" Sep 29 10:50:34 crc kubenswrapper[4922]: I0929 10:50:34.861019 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-575vl_39297e68-ef0c-4e52-922d-28805d4a7171/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:34 crc kubenswrapper[4922]: I0929 10:50:34.995272 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6e84e53c-d007-4780-be8e-1794d0c7b88f/nova-metadata-log/0.log" Sep 29 10:50:35 crc kubenswrapper[4922]: I0929 10:50:35.479479 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ee836f5f-3a1b-4c14-9234-711246af0b41/nova-scheduler-scheduler/0.log" Sep 29 10:50:35 crc kubenswrapper[4922]: I0929 10:50:35.662199 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7c255b1-65cb-42e0-b799-e3a735956220/mysql-bootstrap/0.log" Sep 29 10:50:35 crc kubenswrapper[4922]: I0929 
10:50:35.898440 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7c255b1-65cb-42e0-b799-e3a735956220/mysql-bootstrap/0.log" Sep 29 10:50:35 crc kubenswrapper[4922]: I0929 10:50:35.934296 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7c255b1-65cb-42e0-b799-e3a735956220/galera/0.log" Sep 29 10:50:36 crc kubenswrapper[4922]: I0929 10:50:36.177151 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f2f851a-d7af-4580-8867-6865c5f1d4ce/mysql-bootstrap/0.log" Sep 29 10:50:36 crc kubenswrapper[4922]: I0929 10:50:36.367198 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f2f851a-d7af-4580-8867-6865c5f1d4ce/mysql-bootstrap/0.log" Sep 29 10:50:36 crc kubenswrapper[4922]: I0929 10:50:36.416196 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6e84e53c-d007-4780-be8e-1794d0c7b88f/nova-metadata-metadata/0.log" Sep 29 10:50:36 crc kubenswrapper[4922]: I0929 10:50:36.425092 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f2f851a-d7af-4580-8867-6865c5f1d4ce/galera/0.log" Sep 29 10:50:36 crc kubenswrapper[4922]: I0929 10:50:36.606256 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c6091d68-5a19-44af-8ffb-ec05b516a160/openstackclient/0.log" Sep 29 10:50:36 crc kubenswrapper[4922]: I0929 10:50:36.839280 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6kqsg_12d2ae39-f918-485b-a8c4-b083cdf9d48f/ovn-controller/0.log" Sep 29 10:50:36 crc kubenswrapper[4922]: I0929 10:50:36.912022 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cgtjg_cf954c93-5942-433b-bbb7-6f0737969eb5/openstack-network-exporter/0.log" Sep 29 10:50:37 crc kubenswrapper[4922]: I0929 10:50:37.149562 4922 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bzdcj_404af620-a2df-4414-acfc-b669e8518298/ovsdb-server-init/0.log" Sep 29 10:50:37 crc kubenswrapper[4922]: I0929 10:50:37.496675 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bzdcj_404af620-a2df-4414-acfc-b669e8518298/ovsdb-server-init/0.log" Sep 29 10:50:37 crc kubenswrapper[4922]: I0929 10:50:37.558730 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bzdcj_404af620-a2df-4414-acfc-b669e8518298/ovsdb-server/0.log" Sep 29 10:50:37 crc kubenswrapper[4922]: I0929 10:50:37.568409 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bzdcj_404af620-a2df-4414-acfc-b669e8518298/ovs-vswitchd/0.log" Sep 29 10:50:37 crc kubenswrapper[4922]: I0929 10:50:37.874407 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fz6hz_4e19dd22-3797-45d1-b2b2-3bc1a3d8b36e/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:37 crc kubenswrapper[4922]: I0929 10:50:37.924509 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6b7e7da6-14cb-4046-b71d-8039326ca601/openstack-network-exporter/0.log" Sep 29 10:50:38 crc kubenswrapper[4922]: I0929 10:50:38.077582 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6b7e7da6-14cb-4046-b71d-8039326ca601/ovn-northd/0.log" Sep 29 10:50:38 crc kubenswrapper[4922]: I0929 10:50:38.153566 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d76a9416-91c8-4df6-b6a4-898c4df4ac1a/openstack-network-exporter/0.log" Sep 29 10:50:38 crc kubenswrapper[4922]: I0929 10:50:38.288325 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d76a9416-91c8-4df6-b6a4-898c4df4ac1a/ovsdbserver-nb/0.log" Sep 29 10:50:38 crc kubenswrapper[4922]: I0929 
10:50:38.340149 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d267e81e-9044-4619-b2f2-4c370674a31c/openstack-network-exporter/0.log" Sep 29 10:50:38 crc kubenswrapper[4922]: I0929 10:50:38.498221 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d267e81e-9044-4619-b2f2-4c370674a31c/ovsdbserver-sb/0.log" Sep 29 10:50:38 crc kubenswrapper[4922]: I0929 10:50:38.656862 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7fc9c79bdd-vqp6p_aceaf0a2-2b2b-4ef9-99d1-8bd21f553634/placement-api/0.log" Sep 29 10:50:38 crc kubenswrapper[4922]: I0929 10:50:38.827481 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7fc9c79bdd-vqp6p_aceaf0a2-2b2b-4ef9-99d1-8bd21f553634/placement-log/0.log" Sep 29 10:50:38 crc kubenswrapper[4922]: I0929 10:50:38.912239 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_82b45f77-ae02-47df-b1ab-5137f6e23089/setup-container/0.log" Sep 29 10:50:39 crc kubenswrapper[4922]: I0929 10:50:39.126295 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_82b45f77-ae02-47df-b1ab-5137f6e23089/rabbitmq/0.log" Sep 29 10:50:39 crc kubenswrapper[4922]: I0929 10:50:39.149204 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_82b45f77-ae02-47df-b1ab-5137f6e23089/setup-container/0.log" Sep 29 10:50:39 crc kubenswrapper[4922]: I0929 10:50:39.392082 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d2c1fe4-f762-40fa-8439-f74d3e234d30/setup-container/0.log" Sep 29 10:50:39 crc kubenswrapper[4922]: I0929 10:50:39.557801 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d2c1fe4-f762-40fa-8439-f74d3e234d30/rabbitmq/0.log" Sep 29 10:50:39 crc kubenswrapper[4922]: I0929 10:50:39.592103 4922 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d2c1fe4-f762-40fa-8439-f74d3e234d30/setup-container/0.log" Sep 29 10:50:39 crc kubenswrapper[4922]: I0929 10:50:39.768257 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-8jzsr_2bb4b88d-fc96-488b-a144-7f524d2cd1e7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:39 crc kubenswrapper[4922]: I0929 10:50:39.911437 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-2rqds_782111a0-a54f-49fa-a519-e0d3a68e9cbf/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:40 crc kubenswrapper[4922]: I0929 10:50:40.098371 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-p7r8t_7d952f02-09db-44fd-ae8b-6b2c8ea06505/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:40 crc kubenswrapper[4922]: I0929 10:50:40.285069 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kfbbn_3ae2127d-25fd-4296-9143-1f12b7ffd0c2/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:40 crc kubenswrapper[4922]: I0929 10:50:40.425286 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ntxlw_b579a838-93e1-47ac-8069-b49e76d8d630/ssh-known-hosts-edpm-deployment/0.log" Sep 29 10:50:40 crc kubenswrapper[4922]: I0929 10:50:40.639343 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f74f76895-9f28s_1b044ac1-a144-454a-a2f7-bf438ba13cc0/proxy-server/0.log" Sep 29 10:50:40 crc kubenswrapper[4922]: I0929 10:50:40.718325 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f74f76895-9f28s_1b044ac1-a144-454a-a2f7-bf438ba13cc0/proxy-httpd/0.log" Sep 29 10:50:40 crc kubenswrapper[4922]: I0929 
10:50:40.842387 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tvgqs_396dcf64-c14b-4e56-9533-dbadbfac272a/swift-ring-rebalance/0.log" Sep 29 10:50:40 crc kubenswrapper[4922]: I0929 10:50:40.983588 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/account-auditor/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.068776 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/account-reaper/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.448699 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/account-server/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.501185 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/account-replicator/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.526281 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/container-auditor/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.636230 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/container-replicator/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.732543 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/container-server/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.737291 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/container-updater/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.907228 4922 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-auditor/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.973170 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-replicator/0.log" Sep 29 10:50:41 crc kubenswrapper[4922]: I0929 10:50:41.975393 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-expirer/0.log" Sep 29 10:50:42 crc kubenswrapper[4922]: I0929 10:50:42.094927 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-server/0.log" Sep 29 10:50:42 crc kubenswrapper[4922]: I0929 10:50:42.220361 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/object-updater/0.log" Sep 29 10:50:42 crc kubenswrapper[4922]: I0929 10:50:42.245143 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/rsync/0.log" Sep 29 10:50:42 crc kubenswrapper[4922]: I0929 10:50:42.328252 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1107f41-4b1d-4531-91cc-329f8ba26bea/swift-recon-cron/0.log" Sep 29 10:50:42 crc kubenswrapper[4922]: I0929 10:50:42.490599 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dd76x_a810e32e-1655-40f8-b445-9922b0d5603f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:42 crc kubenswrapper[4922]: I0929 10:50:42.746978 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b34ceaf2-30f5-4be7-8806-fad8a2bd21ab/tempest-tests-tempest-tests-runner/0.log" Sep 29 10:50:42 crc kubenswrapper[4922]: I0929 10:50:42.791680 4922 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_03f2e2e4-eb8b-4f50-9f46-5069c3c8a5df/test-operator-logs-container/0.log" Sep 29 10:50:43 crc kubenswrapper[4922]: I0929 10:50:43.028971 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9gtxn_6e5d0e82-5e2e-4395-9dcf-a3f6d856ee0f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 29 10:50:54 crc kubenswrapper[4922]: I0929 10:50:54.663158 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_617d5c0b-6b79-4ec2-a50e-bd4afa1d8e0c/memcached/0.log" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.224176 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hwrrs"] Sep 29 10:51:18 crc kubenswrapper[4922]: E0929 10:51:18.225498 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerName="extract-content" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.225514 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerName="extract-content" Sep 29 10:51:18 crc kubenswrapper[4922]: E0929 10:51:18.225526 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerName="extract-utilities" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.225534 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerName="extract-utilities" Sep 29 10:51:18 crc kubenswrapper[4922]: E0929 10:51:18.225555 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerName="registry-server" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.225562 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" 
containerName="registry-server" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.225808 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf04013f-1577-4f1b-a8ad-cf2a0151c317" containerName="registry-server" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.236861 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.258005 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwrrs"] Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.334824 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-catalog-content\") pod \"redhat-operators-hwrrs\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.334938 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-utilities\") pod \"redhat-operators-hwrrs\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.335104 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzjdl\" (UniqueName: \"kubernetes.io/projected/cbc35896-caee-4ef1-a04e-7a52c58aee85-kube-api-access-lzjdl\") pod \"redhat-operators-hwrrs\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.435994 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzjdl\" 
(UniqueName: \"kubernetes.io/projected/cbc35896-caee-4ef1-a04e-7a52c58aee85-kube-api-access-lzjdl\") pod \"redhat-operators-hwrrs\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.436100 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-catalog-content\") pod \"redhat-operators-hwrrs\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.436225 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-utilities\") pod \"redhat-operators-hwrrs\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.437206 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-utilities\") pod \"redhat-operators-hwrrs\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.437996 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-catalog-content\") pod \"redhat-operators-hwrrs\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.463749 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzjdl\" (UniqueName: 
\"kubernetes.io/projected/cbc35896-caee-4ef1-a04e-7a52c58aee85-kube-api-access-lzjdl\") pod \"redhat-operators-hwrrs\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:18 crc kubenswrapper[4922]: I0929 10:51:18.575294 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:19 crc kubenswrapper[4922]: I0929 10:51:19.589730 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwrrs"] Sep 29 10:51:19 crc kubenswrapper[4922]: I0929 10:51:19.798394 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrrs" event={"ID":"cbc35896-caee-4ef1-a04e-7a52c58aee85","Type":"ContainerStarted","Data":"0281f534830a220d77cf23ec7fb962aa48dffde536023d53194fac02f3928a4d"} Sep 29 10:51:20 crc kubenswrapper[4922]: I0929 10:51:20.807318 4922 generic.go:334] "Generic (PLEG): container finished" podID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerID="d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a" exitCode=0 Sep 29 10:51:20 crc kubenswrapper[4922]: I0929 10:51:20.807511 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrrs" event={"ID":"cbc35896-caee-4ef1-a04e-7a52c58aee85","Type":"ContainerDied","Data":"d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a"} Sep 29 10:51:21 crc kubenswrapper[4922]: I0929 10:51:21.818199 4922 generic.go:334] "Generic (PLEG): container finished" podID="dc909a37-c2d0-4885-8811-3661f036b134" containerID="e707507c3e1a01a7fb064a69947360ce4fbd48538f9df2e34588931d8c02b4f9" exitCode=0 Sep 29 10:51:21 crc kubenswrapper[4922]: I0929 10:51:21.818310 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/crc-debug-589gh" 
event={"ID":"dc909a37-c2d0-4885-8811-3661f036b134","Type":"ContainerDied","Data":"e707507c3e1a01a7fb064a69947360ce4fbd48538f9df2e34588931d8c02b4f9"} Sep 29 10:51:21 crc kubenswrapper[4922]: I0929 10:51:21.821472 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrrs" event={"ID":"cbc35896-caee-4ef1-a04e-7a52c58aee85","Type":"ContainerStarted","Data":"85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1"} Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.833014 4922 generic.go:334] "Generic (PLEG): container finished" podID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerID="85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1" exitCode=0 Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.833107 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrrs" event={"ID":"cbc35896-caee-4ef1-a04e-7a52c58aee85","Type":"ContainerDied","Data":"85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1"} Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.925755 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.955140 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85kl9\" (UniqueName: \"kubernetes.io/projected/dc909a37-c2d0-4885-8811-3661f036b134-kube-api-access-85kl9\") pod \"dc909a37-c2d0-4885-8811-3661f036b134\" (UID: \"dc909a37-c2d0-4885-8811-3661f036b134\") " Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.955429 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc909a37-c2d0-4885-8811-3661f036b134-host\") pod \"dc909a37-c2d0-4885-8811-3661f036b134\" (UID: \"dc909a37-c2d0-4885-8811-3661f036b134\") " Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.955561 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc909a37-c2d0-4885-8811-3661f036b134-host" (OuterVolumeSpecName: "host") pod "dc909a37-c2d0-4885-8811-3661f036b134" (UID: "dc909a37-c2d0-4885-8811-3661f036b134"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.955783 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zrncb/crc-debug-589gh"] Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.956095 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc909a37-c2d0-4885-8811-3661f036b134-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.962355 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc909a37-c2d0-4885-8811-3661f036b134-kube-api-access-85kl9" (OuterVolumeSpecName: "kube-api-access-85kl9") pod "dc909a37-c2d0-4885-8811-3661f036b134" (UID: "dc909a37-c2d0-4885-8811-3661f036b134"). 
InnerVolumeSpecName "kube-api-access-85kl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:51:22 crc kubenswrapper[4922]: I0929 10:51:22.962636 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zrncb/crc-debug-589gh"] Sep 29 10:51:23 crc kubenswrapper[4922]: I0929 10:51:23.058093 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85kl9\" (UniqueName: \"kubernetes.io/projected/dc909a37-c2d0-4885-8811-3661f036b134-kube-api-access-85kl9\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:23 crc kubenswrapper[4922]: I0929 10:51:23.462603 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc909a37-c2d0-4885-8811-3661f036b134" path="/var/lib/kubelet/pods/dc909a37-c2d0-4885-8811-3661f036b134/volumes" Sep 29 10:51:23 crc kubenswrapper[4922]: I0929 10:51:23.842600 4922 scope.go:117] "RemoveContainer" containerID="e707507c3e1a01a7fb064a69947360ce4fbd48538f9df2e34588931d8c02b4f9" Sep 29 10:51:23 crc kubenswrapper[4922]: I0929 10:51:23.842656 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-589gh" Sep 29 10:51:23 crc kubenswrapper[4922]: I0929 10:51:23.846792 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrrs" event={"ID":"cbc35896-caee-4ef1-a04e-7a52c58aee85","Type":"ContainerStarted","Data":"a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c"} Sep 29 10:51:23 crc kubenswrapper[4922]: I0929 10:51:23.874974 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hwrrs" podStartSLOduration=3.202068153 podStartE2EDuration="5.87495229s" podCreationTimestamp="2025-09-29 10:51:18 +0000 UTC" firstStartedPulling="2025-09-29 10:51:20.809485251 +0000 UTC m=+4006.175715515" lastFinishedPulling="2025-09-29 10:51:23.482369388 +0000 UTC m=+4008.848599652" observedRunningTime="2025-09-29 10:51:23.868703226 +0000 UTC m=+4009.234933500" watchObservedRunningTime="2025-09-29 10:51:23.87495229 +0000 UTC m=+4009.241182554" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.149478 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zrncb/crc-debug-9sbz9"] Sep 29 10:51:24 crc kubenswrapper[4922]: E0929 10:51:24.150599 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc909a37-c2d0-4885-8811-3661f036b134" containerName="container-00" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.150628 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc909a37-c2d0-4885-8811-3661f036b134" containerName="container-00" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.151323 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc909a37-c2d0-4885-8811-3661f036b134" containerName="container-00" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.152616 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.178811 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrh77\" (UniqueName: \"kubernetes.io/projected/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-kube-api-access-nrh77\") pod \"crc-debug-9sbz9\" (UID: \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\") " pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.178932 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-host\") pod \"crc-debug-9sbz9\" (UID: \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\") " pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.281692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrh77\" (UniqueName: \"kubernetes.io/projected/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-kube-api-access-nrh77\") pod \"crc-debug-9sbz9\" (UID: \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\") " pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.281826 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-host\") pod \"crc-debug-9sbz9\" (UID: \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\") " pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.281969 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-host\") pod \"crc-debug-9sbz9\" (UID: \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\") " pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:24 crc 
kubenswrapper[4922]: I0929 10:51:24.304575 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrh77\" (UniqueName: \"kubernetes.io/projected/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-kube-api-access-nrh77\") pod \"crc-debug-9sbz9\" (UID: \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\") " pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.471203 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:24 crc kubenswrapper[4922]: W0929 10:51:24.504156 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode22ea5f9_8c0b_49ef_96bb_ac7692816ecd.slice/crio-2d253e24b62eac3c8d4807819d4d44a2daf885f6809973a2f46f7adfbb7f4531 WatchSource:0}: Error finding container 2d253e24b62eac3c8d4807819d4d44a2daf885f6809973a2f46f7adfbb7f4531: Status 404 returned error can't find the container with id 2d253e24b62eac3c8d4807819d4d44a2daf885f6809973a2f46f7adfbb7f4531 Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.862088 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/crc-debug-9sbz9" event={"ID":"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd","Type":"ContainerStarted","Data":"f158006e8111a65d1119489e578a03d8a8b426e90e5a63e45bff544d8652062a"} Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.862133 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/crc-debug-9sbz9" event={"ID":"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd","Type":"ContainerStarted","Data":"2d253e24b62eac3c8d4807819d4d44a2daf885f6809973a2f46f7adfbb7f4531"} Sep 29 10:51:24 crc kubenswrapper[4922]: I0929 10:51:24.884205 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zrncb/crc-debug-9sbz9" podStartSLOduration=0.884182747 podStartE2EDuration="884.182747ms" 
podCreationTimestamp="2025-09-29 10:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 10:51:24.877241617 +0000 UTC m=+4010.243471881" watchObservedRunningTime="2025-09-29 10:51:24.884182747 +0000 UTC m=+4010.250413011" Sep 29 10:51:25 crc kubenswrapper[4922]: I0929 10:51:25.885934 4922 generic.go:334] "Generic (PLEG): container finished" podID="e22ea5f9-8c0b-49ef-96bb-ac7692816ecd" containerID="f158006e8111a65d1119489e578a03d8a8b426e90e5a63e45bff544d8652062a" exitCode=0 Sep 29 10:51:25 crc kubenswrapper[4922]: I0929 10:51:25.886031 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/crc-debug-9sbz9" event={"ID":"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd","Type":"ContainerDied","Data":"f158006e8111a65d1119489e578a03d8a8b426e90e5a63e45bff544d8652062a"} Sep 29 10:51:26 crc kubenswrapper[4922]: I0929 10:51:26.999888 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:27 crc kubenswrapper[4922]: I0929 10:51:27.026959 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-host\") pod \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\" (UID: \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\") " Sep 29 10:51:27 crc kubenswrapper[4922]: I0929 10:51:27.027033 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-host" (OuterVolumeSpecName: "host") pod "e22ea5f9-8c0b-49ef-96bb-ac7692816ecd" (UID: "e22ea5f9-8c0b-49ef-96bb-ac7692816ecd"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:51:27 crc kubenswrapper[4922]: I0929 10:51:27.027183 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrh77\" (UniqueName: \"kubernetes.io/projected/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-kube-api-access-nrh77\") pod \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\" (UID: \"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd\") " Sep 29 10:51:27 crc kubenswrapper[4922]: I0929 10:51:27.027811 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:27 crc kubenswrapper[4922]: I0929 10:51:27.035255 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-kube-api-access-nrh77" (OuterVolumeSpecName: "kube-api-access-nrh77") pod "e22ea5f9-8c0b-49ef-96bb-ac7692816ecd" (UID: "e22ea5f9-8c0b-49ef-96bb-ac7692816ecd"). InnerVolumeSpecName "kube-api-access-nrh77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:51:27 crc kubenswrapper[4922]: I0929 10:51:27.129131 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrh77\" (UniqueName: \"kubernetes.io/projected/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd-kube-api-access-nrh77\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:27 crc kubenswrapper[4922]: I0929 10:51:27.905326 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/crc-debug-9sbz9" event={"ID":"e22ea5f9-8c0b-49ef-96bb-ac7692816ecd","Type":"ContainerDied","Data":"2d253e24b62eac3c8d4807819d4d44a2daf885f6809973a2f46f7adfbb7f4531"} Sep 29 10:51:27 crc kubenswrapper[4922]: I0929 10:51:27.905388 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d253e24b62eac3c8d4807819d4d44a2daf885f6809973a2f46f7adfbb7f4531" Sep 29 10:51:27 crc kubenswrapper[4922]: I0929 10:51:27.905674 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-9sbz9" Sep 29 10:51:28 crc kubenswrapper[4922]: I0929 10:51:28.575377 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:28 crc kubenswrapper[4922]: I0929 10:51:28.575715 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:28 crc kubenswrapper[4922]: I0929 10:51:28.634561 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:28 crc kubenswrapper[4922]: I0929 10:51:28.963504 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:29 crc kubenswrapper[4922]: I0929 10:51:29.007695 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwrrs"] Sep 29 
10:51:29 crc kubenswrapper[4922]: I0929 10:51:29.070101 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:51:29 crc kubenswrapper[4922]: I0929 10:51:29.070153 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:51:30 crc kubenswrapper[4922]: I0929 10:51:30.933216 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hwrrs" podUID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerName="registry-server" containerID="cri-o://a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c" gracePeriod=2 Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.384359 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.505035 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-utilities\") pod \"cbc35896-caee-4ef1-a04e-7a52c58aee85\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.505116 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzjdl\" (UniqueName: \"kubernetes.io/projected/cbc35896-caee-4ef1-a04e-7a52c58aee85-kube-api-access-lzjdl\") pod \"cbc35896-caee-4ef1-a04e-7a52c58aee85\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.505203 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-catalog-content\") pod \"cbc35896-caee-4ef1-a04e-7a52c58aee85\" (UID: \"cbc35896-caee-4ef1-a04e-7a52c58aee85\") " Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.506417 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-utilities" (OuterVolumeSpecName: "utilities") pod "cbc35896-caee-4ef1-a04e-7a52c58aee85" (UID: "cbc35896-caee-4ef1-a04e-7a52c58aee85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.512207 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc35896-caee-4ef1-a04e-7a52c58aee85-kube-api-access-lzjdl" (OuterVolumeSpecName: "kube-api-access-lzjdl") pod "cbc35896-caee-4ef1-a04e-7a52c58aee85" (UID: "cbc35896-caee-4ef1-a04e-7a52c58aee85"). InnerVolumeSpecName "kube-api-access-lzjdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.592476 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zrncb/crc-debug-9sbz9"] Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.597426 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbc35896-caee-4ef1-a04e-7a52c58aee85" (UID: "cbc35896-caee-4ef1-a04e-7a52c58aee85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.601047 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zrncb/crc-debug-9sbz9"] Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.608361 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.608403 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzjdl\" (UniqueName: \"kubernetes.io/projected/cbc35896-caee-4ef1-a04e-7a52c58aee85-kube-api-access-lzjdl\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.608413 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbc35896-caee-4ef1-a04e-7a52c58aee85-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.942450 4922 generic.go:334] "Generic (PLEG): container finished" podID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerID="a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c" exitCode=0 Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.942520 4922 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwrrs" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.942541 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrrs" event={"ID":"cbc35896-caee-4ef1-a04e-7a52c58aee85","Type":"ContainerDied","Data":"a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c"} Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.943578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrrs" event={"ID":"cbc35896-caee-4ef1-a04e-7a52c58aee85","Type":"ContainerDied","Data":"0281f534830a220d77cf23ec7fb962aa48dffde536023d53194fac02f3928a4d"} Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.943600 4922 scope.go:117] "RemoveContainer" containerID="a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.972696 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwrrs"] Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.976599 4922 scope.go:117] "RemoveContainer" containerID="85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1" Sep 29 10:51:31 crc kubenswrapper[4922]: I0929 10:51:31.979753 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hwrrs"] Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.018004 4922 scope.go:117] "RemoveContainer" containerID="d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.042653 4922 scope.go:117] "RemoveContainer" containerID="a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c" Sep 29 10:51:32 crc kubenswrapper[4922]: E0929 10:51:32.043224 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c\": container with ID starting with a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c not found: ID does not exist" containerID="a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.043274 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c"} err="failed to get container status \"a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c\": rpc error: code = NotFound desc = could not find container \"a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c\": container with ID starting with a26b23edfd7693340a876f4077092a0303390b308d41da63fd0cbf6ca2b8bf0c not found: ID does not exist" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.043301 4922 scope.go:117] "RemoveContainer" containerID="85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1" Sep 29 10:51:32 crc kubenswrapper[4922]: E0929 10:51:32.043560 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1\": container with ID starting with 85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1 not found: ID does not exist" containerID="85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.043594 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1"} err="failed to get container status \"85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1\": rpc error: code = NotFound desc = could not find container \"85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1\": container with ID 
starting with 85000554a854ddd64f5fa64b8ebfc236233d8e803977f170e31aef15490a7dd1 not found: ID does not exist" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.043620 4922 scope.go:117] "RemoveContainer" containerID="d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a" Sep 29 10:51:32 crc kubenswrapper[4922]: E0929 10:51:32.043985 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a\": container with ID starting with d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a not found: ID does not exist" containerID="d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.044052 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a"} err="failed to get container status \"d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a\": rpc error: code = NotFound desc = could not find container \"d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a\": container with ID starting with d07080e4d5a46a025b84052090124ffc9a341dcb2e1e9879a7aa83adfcb2ca0a not found: ID does not exist" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.766114 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zrncb/crc-debug-w9zd2"] Sep 29 10:51:32 crc kubenswrapper[4922]: E0929 10:51:32.766528 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerName="registry-server" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.766545 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerName="registry-server" Sep 29 10:51:32 crc kubenswrapper[4922]: E0929 10:51:32.766564 4922 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e22ea5f9-8c0b-49ef-96bb-ac7692816ecd" containerName="container-00" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.766571 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22ea5f9-8c0b-49ef-96bb-ac7692816ecd" containerName="container-00" Sep 29 10:51:32 crc kubenswrapper[4922]: E0929 10:51:32.766582 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerName="extract-utilities" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.766588 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerName="extract-utilities" Sep 29 10:51:32 crc kubenswrapper[4922]: E0929 10:51:32.766622 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerName="extract-content" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.766629 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerName="extract-content" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.766801 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc35896-caee-4ef1-a04e-7a52c58aee85" containerName="registry-server" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.766828 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22ea5f9-8c0b-49ef-96bb-ac7692816ecd" containerName="container-00" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.767472 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.831967 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdjk8\" (UniqueName: \"kubernetes.io/projected/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-kube-api-access-xdjk8\") pod \"crc-debug-w9zd2\" (UID: \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\") " pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.832051 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-host\") pod \"crc-debug-w9zd2\" (UID: \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\") " pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.934959 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdjk8\" (UniqueName: \"kubernetes.io/projected/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-kube-api-access-xdjk8\") pod \"crc-debug-w9zd2\" (UID: \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\") " pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.935046 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-host\") pod \"crc-debug-w9zd2\" (UID: \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\") " pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:32 crc kubenswrapper[4922]: I0929 10:51:32.935296 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-host\") pod \"crc-debug-w9zd2\" (UID: \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\") " pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:33 crc 
kubenswrapper[4922]: I0929 10:51:33.086224 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdjk8\" (UniqueName: \"kubernetes.io/projected/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-kube-api-access-xdjk8\") pod \"crc-debug-w9zd2\" (UID: \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\") " pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:33 crc kubenswrapper[4922]: I0929 10:51:33.092865 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:33 crc kubenswrapper[4922]: I0929 10:51:33.469565 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc35896-caee-4ef1-a04e-7a52c58aee85" path="/var/lib/kubelet/pods/cbc35896-caee-4ef1-a04e-7a52c58aee85/volumes" Sep 29 10:51:33 crc kubenswrapper[4922]: I0929 10:51:33.471448 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22ea5f9-8c0b-49ef-96bb-ac7692816ecd" path="/var/lib/kubelet/pods/e22ea5f9-8c0b-49ef-96bb-ac7692816ecd/volumes" Sep 29 10:51:33 crc kubenswrapper[4922]: I0929 10:51:33.964883 4922 generic.go:334] "Generic (PLEG): container finished" podID="712d73bc-c2f2-4acc-b52b-90fc0a9e7579" containerID="caa8c7a757c8f7024031f0288dfa13a812d3edb337753b3316a77dc9ff00a49c" exitCode=0 Sep 29 10:51:33 crc kubenswrapper[4922]: I0929 10:51:33.964946 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/crc-debug-w9zd2" event={"ID":"712d73bc-c2f2-4acc-b52b-90fc0a9e7579","Type":"ContainerDied","Data":"caa8c7a757c8f7024031f0288dfa13a812d3edb337753b3316a77dc9ff00a49c"} Sep 29 10:51:33 crc kubenswrapper[4922]: I0929 10:51:33.965281 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/crc-debug-w9zd2" event={"ID":"712d73bc-c2f2-4acc-b52b-90fc0a9e7579","Type":"ContainerStarted","Data":"a9c3ba9b5336b88f0a72e6b471e51602f7928faf5e074c7a419da20363bf73be"} Sep 29 10:51:34 crc kubenswrapper[4922]: I0929 
10:51:34.008916 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zrncb/crc-debug-w9zd2"] Sep 29 10:51:34 crc kubenswrapper[4922]: I0929 10:51:34.020265 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zrncb/crc-debug-w9zd2"] Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.099589 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.177251 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-host\") pod \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\" (UID: \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\") " Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.177397 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-host" (OuterVolumeSpecName: "host") pod "712d73bc-c2f2-4acc-b52b-90fc0a9e7579" (UID: "712d73bc-c2f2-4acc-b52b-90fc0a9e7579"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.177506 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdjk8\" (UniqueName: \"kubernetes.io/projected/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-kube-api-access-xdjk8\") pod \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\" (UID: \"712d73bc-c2f2-4acc-b52b-90fc0a9e7579\") " Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.178566 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-host\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.183163 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-kube-api-access-xdjk8" (OuterVolumeSpecName: "kube-api-access-xdjk8") pod "712d73bc-c2f2-4acc-b52b-90fc0a9e7579" (UID: "712d73bc-c2f2-4acc-b52b-90fc0a9e7579"). InnerVolumeSpecName "kube-api-access-xdjk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.280910 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdjk8\" (UniqueName: \"kubernetes.io/projected/712d73bc-c2f2-4acc-b52b-90fc0a9e7579-kube-api-access-xdjk8\") on node \"crc\" DevicePath \"\"" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.416729 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-wxkn7_733a9696-fd92-42d4-b4df-6e4ba3d9d433/kube-rbac-proxy/0.log" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.473899 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712d73bc-c2f2-4acc-b52b-90fc0a9e7579" path="/var/lib/kubelet/pods/712d73bc-c2f2-4acc-b52b-90fc0a9e7579/volumes" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.484322 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6495d75b5-wxkn7_733a9696-fd92-42d4-b4df-6e4ba3d9d433/manager/0.log" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.638470 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-8vhzj_05dca5c7-0856-4c86-9bf8-99c6edc07252/kube-rbac-proxy/0.log" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.674667 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748c574d75-8vhzj_05dca5c7-0856-4c86-9bf8-99c6edc07252/manager/0.log" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.773734 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-45kw4_2536f9c0-aac9-4d2c-be19-8afe9ac2e418/kube-rbac-proxy/0.log" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.822396 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d74f4d695-45kw4_2536f9c0-aac9-4d2c-be19-8afe9ac2e418/manager/0.log" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.863305 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/util/0.log" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.988346 4922 scope.go:117] "RemoveContainer" containerID="caa8c7a757c8f7024031f0288dfa13a812d3edb337753b3316a77dc9ff00a49c" Sep 29 10:51:35 crc kubenswrapper[4922]: I0929 10:51:35.988449 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/crc-debug-w9zd2" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.053571 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/util/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.053893 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/pull/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.094472 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/pull/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.207727 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/util/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.222939 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/pull/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.255046 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb8b04c3cc17dd3f4dae1f40a7bd910b85e290f2ee7ecc02a1ec3b79abb6xrl_46ce3f41-6af5-42e1-9712-ad73b0089ad9/extract/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.372558 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-ph6mk_0ed6eee8-7938-4f36-98f8-99af2cc40a4e/kube-rbac-proxy/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.437261 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-jhr5q_88c2443d-e9bf-441b-ae76-93b7f63c790b/kube-rbac-proxy/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.475024 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67b5d44b7f-ph6mk_0ed6eee8-7938-4f36-98f8-99af2cc40a4e/manager/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.565904 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8ff95898-jhr5q_88c2443d-e9bf-441b-ae76-93b7f63c790b/manager/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.618865 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-ft79c_aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0/kube-rbac-proxy/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.697120 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-695847bc78-ft79c_aab8725e-fdd4-46bc-9d3d-daf8fdf4e8a0/manager/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 
10:51:36.820279 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-8pnnw_698d9305-27b8-44fe-bcd9-f034bdfa9b09/kube-rbac-proxy/0.log" Sep 29 10:51:36 crc kubenswrapper[4922]: I0929 10:51:36.999104 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-jbs5x_fe4d01cb-1457-4cca-b2ed-7da6250a47df/kube-rbac-proxy/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.011391 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-858cd69f49-8pnnw_698d9305-27b8-44fe-bcd9-f034bdfa9b09/manager/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.042682 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9fc8d5567-jbs5x_fe4d01cb-1457-4cca-b2ed-7da6250a47df/manager/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.191988 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-xns9g_39c6dedb-23e2-4515-83c8-1e85e0136cc8/kube-rbac-proxy/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.229475 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7bf498966c-xns9g_39c6dedb-23e2-4515-83c8-1e85e0136cc8/manager/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.264925 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-r8wxg_c9221095-3450-45f9-9aa2-e4994c8471ef/kube-rbac-proxy/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.359469 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56cf9c6b99-r8wxg_c9221095-3450-45f9-9aa2-e4994c8471ef/manager/0.log" Sep 29 
10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.413897 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-b69kd_9c51299d-7ce3-4dff-b555-8cc2bcee6e4c/kube-rbac-proxy/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.504481 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-687b9cf756-b69kd_9c51299d-7ce3-4dff-b555-8cc2bcee6e4c/manager/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.598930 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-r4lcr_8ad2a8b0-1e70-47e2-80a1-139eedb15541/kube-rbac-proxy/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.683454 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54d766c9f9-r4lcr_8ad2a8b0-1e70-47e2-80a1-139eedb15541/manager/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.796065 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-knzwz_a01ec1f8-817f-4ed8-9431-01847d4956be/kube-rbac-proxy/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.912629 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-knzwz_a01ec1f8-817f-4ed8-9431-01847d4956be/manager/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.957181 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-5wnjj_60630351-afcc-4792-bb16-5994368117cd/kube-rbac-proxy/0.log" Sep 29 10:51:37 crc kubenswrapper[4922]: I0929 10:51:37.990642 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-5wnjj_60630351-afcc-4792-bb16-5994368117cd/manager/0.log" Sep 29 10:51:38 crc kubenswrapper[4922]: I0929 10:51:38.126118 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-p2hsw_2bbc35ab-8adc-445e-bc17-690ce9533a3e/manager/0.log" Sep 29 10:51:38 crc kubenswrapper[4922]: I0929 10:51:38.143699 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-p2hsw_2bbc35ab-8adc-445e-bc17-690ce9533a3e/kube-rbac-proxy/0.log" Sep 29 10:51:38 crc kubenswrapper[4922]: I0929 10:51:38.302435 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8d8dc476c-zjv29_00b67606-b7e3-4043-abff-cae6f14ba095/kube-rbac-proxy/0.log" Sep 29 10:51:38 crc kubenswrapper[4922]: I0929 10:51:38.376856 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7484b66f-slfdl_77c866e5-8ec4-47ac-809c-0fc002c47957/kube-rbac-proxy/0.log" Sep 29 10:51:38 crc kubenswrapper[4922]: I0929 10:51:38.616910 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-8r27w_822c4c9d-3c4c-43db-a891-19b9db1d279b/registry-server/0.log" Sep 29 10:51:38 crc kubenswrapper[4922]: I0929 10:51:38.663372 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7484b66f-slfdl_77c866e5-8ec4-47ac-809c-0fc002c47957/operator/0.log" Sep 29 10:51:38 crc kubenswrapper[4922]: I0929 10:51:38.814971 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-rvj4d_c4ba5f8a-ca61-4870-bc8e-017e79e139a5/kube-rbac-proxy/0.log" Sep 29 10:51:38 crc kubenswrapper[4922]: I0929 
10:51:38.868864 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5f95c46c78-rvj4d_c4ba5f8a-ca61-4870-bc8e-017e79e139a5/manager/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.057473 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-pnns5_c38d04c4-b717-4155-b646-b06c3dac3386/kube-rbac-proxy/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.086898 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-774b97b48-pnns5_c38d04c4-b717-4155-b646-b06c3dac3386/manager/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.208925 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-lspp8_c91f6c11-bc07-4bdb-bfc5-4480e8dff8a3/operator/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.327461 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-82g7r_71aa3678-e2c4-4a23-9e66-738fddb6066f/manager/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.341375 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-82g7r_71aa3678-e2c4-4a23-9e66-738fddb6066f/kube-rbac-proxy/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.425929 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8d8dc476c-zjv29_00b67606-b7e3-4043-abff-cae6f14ba095/manager/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.456291 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-5s5wf_f4fbefa3-c5d4-4a51-b90a-512ebfcef863/kube-rbac-proxy/0.log" Sep 29 
10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.579273 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5bf96cfbc4-5s5wf_f4fbefa3-c5d4-4a51-b90a-512ebfcef863/manager/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.595765 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-lfh4h_97df5e99-5243-4552-ab72-7c6526deea11/kube-rbac-proxy/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.651861 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-lfh4h_97df5e99-5243-4552-ab72-7c6526deea11/manager/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.750050 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-lcd9t_0d5d466f-e41b-42c4-91ff-11e84d297b5d/kube-rbac-proxy/0.log" Sep 29 10:51:39 crc kubenswrapper[4922]: I0929 10:51:39.772936 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-lcd9t_0d5d466f-e41b-42c4-91ff-11e84d297b5d/manager/0.log" Sep 29 10:51:52 crc kubenswrapper[4922]: I0929 10:51:52.607502 4922 scope.go:117] "RemoveContainer" containerID="40d974c544859195cccc1add38e6f3aedc77f08a877fc4cfe6339b5db460abd9" Sep 29 10:51:54 crc kubenswrapper[4922]: I0929 10:51:54.994388 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-cftkt_b2ea2f47-4732-47bd-9099-c503b5610f43/control-plane-machine-set-operator/0.log" Sep 29 10:51:55 crc kubenswrapper[4922]: I0929 10:51:55.155079 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dbc7l_dea0217e-c923-4045-9b4f-90a9eff30f93/kube-rbac-proxy/0.log" Sep 29 10:51:55 crc 
kubenswrapper[4922]: I0929 10:51:55.183706 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dbc7l_dea0217e-c923-4045-9b4f-90a9eff30f93/machine-api-operator/0.log" Sep 29 10:51:59 crc kubenswrapper[4922]: I0929 10:51:59.070990 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:51:59 crc kubenswrapper[4922]: I0929 10:51:59.071644 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:52:05 crc kubenswrapper[4922]: I0929 10:52:05.362187 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-snmmn_7626bb44-b67d-42eb-b912-a9b279f7157d/cert-manager-controller/0.log" Sep 29 10:52:05 crc kubenswrapper[4922]: I0929 10:52:05.538040 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-q9z92_c9cbdf04-ea9d-4663-94c9-345fb63f3f9c/cert-manager-cainjector/0.log" Sep 29 10:52:05 crc kubenswrapper[4922]: I0929 10:52:05.567056 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-x5g9r_93e3a2dc-f64b-4766-b851-2faa2e57c4f4/cert-manager-webhook/0.log" Sep 29 10:52:16 crc kubenswrapper[4922]: I0929 10:52:16.355736 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-gc2pq_bcb21dd2-5cc5-49e5-a2df-e47c1a29dc72/nmstate-console-plugin/0.log" Sep 29 10:52:16 crc 
kubenswrapper[4922]: I0929 10:52:16.530999 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sp9q9_bf8e5bd0-e08e-4818-843b-30f7c956626f/nmstate-handler/0.log" Sep 29 10:52:16 crc kubenswrapper[4922]: I0929 10:52:16.568821 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-5cfzj_8fbc50c8-5afc-4ad5-888b-167e84fa22d0/kube-rbac-proxy/0.log" Sep 29 10:52:16 crc kubenswrapper[4922]: I0929 10:52:16.588253 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-5cfzj_8fbc50c8-5afc-4ad5-888b-167e84fa22d0/nmstate-metrics/0.log" Sep 29 10:52:16 crc kubenswrapper[4922]: I0929 10:52:16.770800 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-99cmf_430f697e-6b89-4db1-91a8-194c8a7af724/nmstate-webhook/0.log" Sep 29 10:52:16 crc kubenswrapper[4922]: I0929 10:52:16.777317 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-hrqwq_485426b6-cad6-4591-beaf-d8bb33f79ea1/nmstate-operator/0.log" Sep 29 10:52:29 crc kubenswrapper[4922]: I0929 10:52:29.070734 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 10:52:29 crc kubenswrapper[4922]: I0929 10:52:29.071363 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 10:52:29 crc kubenswrapper[4922]: I0929 10:52:29.071418 4922 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" Sep 29 10:52:29 crc kubenswrapper[4922]: I0929 10:52:29.072220 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c213d1dc05d9221574da4502f4396900113bbadf8dec97284b7ec5120964a91"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 10:52:29 crc kubenswrapper[4922]: I0929 10:52:29.072286 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://7c213d1dc05d9221574da4502f4396900113bbadf8dec97284b7ec5120964a91" gracePeriod=600 Sep 29 10:52:29 crc kubenswrapper[4922]: I0929 10:52:29.471341 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="7c213d1dc05d9221574da4502f4396900113bbadf8dec97284b7ec5120964a91" exitCode=0 Sep 29 10:52:29 crc kubenswrapper[4922]: I0929 10:52:29.471419 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"7c213d1dc05d9221574da4502f4396900113bbadf8dec97284b7ec5120964a91"} Sep 29 10:52:29 crc kubenswrapper[4922]: I0929 10:52:29.472112 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerStarted","Data":"5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"} Sep 29 10:52:29 crc kubenswrapper[4922]: I0929 10:52:29.472142 4922 scope.go:117] "RemoveContainer" 
containerID="5817c30f621249cdb1d2b88f093a75652b4079f8a88e811e5d374aa927e11401"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.063896 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-vz5n2_7732c258-d416-45bd-92a4-1a852c9bf4e6/kube-rbac-proxy/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.239028 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-vz5n2_7732c258-d416-45bd-92a4-1a852c9bf4e6/controller/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.294171 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-frr-files/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.434077 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-frr-files/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.462499 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-metrics/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.496421 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-reloader/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.511458 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-reloader/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.710286 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-metrics/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.726270 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-reloader/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.729484 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-frr-files/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.751422 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-metrics/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.876817 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-metrics/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.897012 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-frr-files/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.907442 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/cp-reloader/0.log"
Sep 29 10:52:31 crc kubenswrapper[4922]: I0929 10:52:31.941803 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/controller/0.log"
Sep 29 10:52:32 crc kubenswrapper[4922]: I0929 10:52:32.118959 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/frr-metrics/0.log"
Sep 29 10:52:32 crc kubenswrapper[4922]: I0929 10:52:32.138938 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/kube-rbac-proxy/0.log"
Sep 29 10:52:32 crc kubenswrapper[4922]: I0929 10:52:32.167679 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/kube-rbac-proxy-frr/0.log"
Sep 29 10:52:32 crc kubenswrapper[4922]: I0929 10:52:32.348188 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/reloader/0.log"
Sep 29 10:52:32 crc kubenswrapper[4922]: I0929 10:52:32.427879 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-p9x6q_fe4bf7c5-f4cf-4c2c-9075-14c39b06297d/frr-k8s-webhook-server/0.log"
Sep 29 10:52:32 crc kubenswrapper[4922]: I0929 10:52:32.570583 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6bfbf6b8fd-w44ps_74df0c3b-e3ed-4061-9fb2-a9a830974755/manager/0.log"
Sep 29 10:52:32 crc kubenswrapper[4922]: I0929 10:52:32.712026 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-654f6f79d6-9gbmz_2d2222f4-496b-4cbf-883c-e3ac89e08a79/webhook-server/0.log"
Sep 29 10:52:32 crc kubenswrapper[4922]: I0929 10:52:32.869989 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jp26n_3b15f008-2077-4246-af46-d39384412fa5/kube-rbac-proxy/0.log"
Sep 29 10:52:33 crc kubenswrapper[4922]: I0929 10:52:33.383277 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jp26n_3b15f008-2077-4246-af46-d39384412fa5/speaker/0.log"
Sep 29 10:52:33 crc kubenswrapper[4922]: I0929 10:52:33.573310 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54pq5_41460288-0fe6-4f0f-ba8d-121ee673bf0d/frr/0.log"
Sep 29 10:52:45 crc kubenswrapper[4922]: I0929 10:52:45.594898 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/util/0.log"
Sep 29 10:52:45 crc kubenswrapper[4922]: I0929 10:52:45.715848 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/util/0.log"
Sep 29 10:52:45 crc kubenswrapper[4922]: I0929 10:52:45.745252 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/pull/0.log"
Sep 29 10:52:45 crc kubenswrapper[4922]: I0929 10:52:45.750820 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/pull/0.log"
Sep 29 10:52:45 crc kubenswrapper[4922]: I0929 10:52:45.949611 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/util/0.log"
Sep 29 10:52:45 crc kubenswrapper[4922]: I0929 10:52:45.950318 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/pull/0.log"
Sep 29 10:52:45 crc kubenswrapper[4922]: I0929 10:52:45.968556 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc789fm_814fb3a2-10f7-4136-8745-0caf3cc5dac8/extract/0.log"
Sep 29 10:52:46 crc kubenswrapper[4922]: I0929 10:52:46.127584 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-utilities/0.log"
Sep 29 10:52:46 crc kubenswrapper[4922]: I0929 10:52:46.509627 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-utilities/0.log"
Sep 29 10:52:46 crc kubenswrapper[4922]: I0929 10:52:46.539340 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-content/0.log"
Sep 29 10:52:46 crc kubenswrapper[4922]: I0929 10:52:46.551384 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-content/0.log"
Sep 29 10:52:46 crc kubenswrapper[4922]: I0929 10:52:46.689585 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-content/0.log"
Sep 29 10:52:46 crc kubenswrapper[4922]: I0929 10:52:46.717087 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/extract-utilities/0.log"
Sep 29 10:52:46 crc kubenswrapper[4922]: I0929 10:52:46.888114 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-utilities/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.170974 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-content/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.208860 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-content/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.209219 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-utilities/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.330775 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2dckj_c6981ca2-b9f8-4907-91f6-c470174f2d9e/registry-server/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.376019 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-utilities/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.404700 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/extract-content/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.598733 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/util/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.823121 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/util/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.871733 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/pull/0.log"
Sep 29 10:52:47 crc kubenswrapper[4922]: I0929 10:52:47.908663 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/pull/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.080100 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ghrvd_70c79985-eb4e-4c5a-a463-356e7c217ed0/registry-server/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.145274 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/extract/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.169774 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/pull/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.171613 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96ff2x9_abce2942-2d7c-4097-992b-3ca6aabdc6f1/util/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.367631 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fkw4t_69498aa6-9b16-42bd-97f7-f3f52b763788/marketplace-operator/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.372060 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-utilities/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.555097 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-content/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.561031 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-content/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.561663 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-utilities/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.732820 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-utilities/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.781811 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/extract-content/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.828029 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-utilities/0.log"
Sep 29 10:52:48 crc kubenswrapper[4922]: I0929 10:52:48.916852 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xd6n6_fbde2060-f54a-448a-a391-dbb6f2cf95e8/registry-server/0.log"
Sep 29 10:52:49 crc kubenswrapper[4922]: I0929 10:52:49.027406 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-utilities/0.log"
Sep 29 10:52:49 crc kubenswrapper[4922]: I0929 10:52:49.031195 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-content/0.log"
Sep 29 10:52:49 crc kubenswrapper[4922]: I0929 10:52:49.056293 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-content/0.log"
Sep 29 10:52:49 crc kubenswrapper[4922]: I0929 10:52:49.221991 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-content/0.log"
Sep 29 10:52:49 crc kubenswrapper[4922]: I0929 10:52:49.225171 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/extract-utilities/0.log"
Sep 29 10:52:49 crc kubenswrapper[4922]: I0929 10:52:49.784051 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5p6k_42bd8ff0-fc46-401b-941c-26b4a171ba7d/registry-server/0.log"
Sep 29 10:54:29 crc kubenswrapper[4922]: I0929 10:54:29.070368 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:54:29 crc kubenswrapper[4922]: I0929 10:54:29.070954 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:54:44 crc kubenswrapper[4922]: I0929 10:54:44.778605 4922 generic.go:334] "Generic (PLEG): container finished" podID="079038bc-a923-488e-9c8c-a2729ab2c150" containerID="8555344e13e12465d043e8f273f97cba4588017c6e374a1b1cee2e75a167959a" exitCode=0
Sep 29 10:54:44 crc kubenswrapper[4922]: I0929 10:54:44.778686 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrncb/must-gather-n9z8d" event={"ID":"079038bc-a923-488e-9c8c-a2729ab2c150","Type":"ContainerDied","Data":"8555344e13e12465d043e8f273f97cba4588017c6e374a1b1cee2e75a167959a"}
Sep 29 10:54:44 crc kubenswrapper[4922]: I0929 10:54:44.780248 4922 scope.go:117] "RemoveContainer" containerID="8555344e13e12465d043e8f273f97cba4588017c6e374a1b1cee2e75a167959a"
Sep 29 10:54:45 crc kubenswrapper[4922]: I0929 10:54:45.167121 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zrncb_must-gather-n9z8d_079038bc-a923-488e-9c8c-a2729ab2c150/gather/0.log"
Sep 29 10:54:51 crc kubenswrapper[4922]: E0929 10:54:51.160321 4922 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.111:49774->38.129.56.111:45175: write tcp 38.129.56.111:49774->38.129.56.111:45175: write: broken pipe
Sep 29 10:54:56 crc kubenswrapper[4922]: I0929 10:54:56.756241 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zrncb/must-gather-n9z8d"]
Sep 29 10:54:56 crc kubenswrapper[4922]: I0929 10:54:56.757089 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zrncb/must-gather-n9z8d" podUID="079038bc-a923-488e-9c8c-a2729ab2c150" containerName="copy" containerID="cri-o://8d28d51cf00ce452830db9e119bd1771724c2732550982821a57fcf993b02e51" gracePeriod=2
Sep 29 10:54:56 crc kubenswrapper[4922]: I0929 10:54:56.767511 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zrncb/must-gather-n9z8d"]
Sep 29 10:54:56 crc kubenswrapper[4922]: I0929 10:54:56.883689 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zrncb_must-gather-n9z8d_079038bc-a923-488e-9c8c-a2729ab2c150/copy/0.log"
Sep 29 10:54:56 crc kubenswrapper[4922]: I0929 10:54:56.884339 4922 generic.go:334] "Generic (PLEG): container finished" podID="079038bc-a923-488e-9c8c-a2729ab2c150" containerID="8d28d51cf00ce452830db9e119bd1771724c2732550982821a57fcf993b02e51" exitCode=143
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.236581 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zrncb_must-gather-n9z8d_079038bc-a923-488e-9c8c-a2729ab2c150/copy/0.log"
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.237622 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/must-gather-n9z8d"
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.271044 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/079038bc-a923-488e-9c8c-a2729ab2c150-must-gather-output\") pod \"079038bc-a923-488e-9c8c-a2729ab2c150\" (UID: \"079038bc-a923-488e-9c8c-a2729ab2c150\") "
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.271240 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hgh2\" (UniqueName: \"kubernetes.io/projected/079038bc-a923-488e-9c8c-a2729ab2c150-kube-api-access-7hgh2\") pod \"079038bc-a923-488e-9c8c-a2729ab2c150\" (UID: \"079038bc-a923-488e-9c8c-a2729ab2c150\") "
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.279032 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079038bc-a923-488e-9c8c-a2729ab2c150-kube-api-access-7hgh2" (OuterVolumeSpecName: "kube-api-access-7hgh2") pod "079038bc-a923-488e-9c8c-a2729ab2c150" (UID: "079038bc-a923-488e-9c8c-a2729ab2c150"). InnerVolumeSpecName "kube-api-access-7hgh2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.373250 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hgh2\" (UniqueName: \"kubernetes.io/projected/079038bc-a923-488e-9c8c-a2729ab2c150-kube-api-access-7hgh2\") on node \"crc\" DevicePath \"\""
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.431194 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/079038bc-a923-488e-9c8c-a2729ab2c150-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "079038bc-a923-488e-9c8c-a2729ab2c150" (UID: "079038bc-a923-488e-9c8c-a2729ab2c150"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.464377 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079038bc-a923-488e-9c8c-a2729ab2c150" path="/var/lib/kubelet/pods/079038bc-a923-488e-9c8c-a2729ab2c150/volumes"
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.475403 4922 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/079038bc-a923-488e-9c8c-a2729ab2c150-must-gather-output\") on node \"crc\" DevicePath \"\""
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.893750 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zrncb_must-gather-n9z8d_079038bc-a923-488e-9c8c-a2729ab2c150/copy/0.log"
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.896434 4922 scope.go:117] "RemoveContainer" containerID="8d28d51cf00ce452830db9e119bd1771724c2732550982821a57fcf993b02e51"
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.896602 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrncb/must-gather-n9z8d"
Sep 29 10:54:57 crc kubenswrapper[4922]: I0929 10:54:57.921939 4922 scope.go:117] "RemoveContainer" containerID="8555344e13e12465d043e8f273f97cba4588017c6e374a1b1cee2e75a167959a"
Sep 29 10:54:59 crc kubenswrapper[4922]: I0929 10:54:59.070689 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:54:59 crc kubenswrapper[4922]: I0929 10:54:59.071213 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:55:29 crc kubenswrapper[4922]: I0929 10:55:29.070533 4922 patch_prober.go:28] interesting pod/machine-config-daemon-kgzgq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 10:55:29 crc kubenswrapper[4922]: I0929 10:55:29.071132 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 10:55:29 crc kubenswrapper[4922]: I0929 10:55:29.071189 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq"
Sep 29 10:55:29 crc kubenswrapper[4922]: I0929 10:55:29.072173 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"} pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 10:55:29 crc kubenswrapper[4922]: I0929 10:55:29.072223 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622" containerName="machine-config-daemon" containerID="cri-o://5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4" gracePeriod=600
Sep 29 10:55:29 crc kubenswrapper[4922]: E0929 10:55:29.201930 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:55:30 crc kubenswrapper[4922]: I0929 10:55:30.183227 4922 generic.go:334] "Generic (PLEG): container finished" podID="18583652-9871-4fba-93c8-9f86e9f57622" containerID="5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4" exitCode=0
Sep 29 10:55:30 crc kubenswrapper[4922]: I0929 10:55:30.183298 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" event={"ID":"18583652-9871-4fba-93c8-9f86e9f57622","Type":"ContainerDied","Data":"5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"}
Sep 29 10:55:30 crc kubenswrapper[4922]: I0929 10:55:30.183358 4922 scope.go:117] "RemoveContainer" containerID="7c213d1dc05d9221574da4502f4396900113bbadf8dec97284b7ec5120964a91"
Sep 29 10:55:30 crc kubenswrapper[4922]: I0929 10:55:30.184295 4922 scope.go:117] "RemoveContainer" containerID="5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"
Sep 29 10:55:30 crc kubenswrapper[4922]: E0929 10:55:30.184774 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:55:41 crc kubenswrapper[4922]: I0929 10:55:41.452258 4922 scope.go:117] "RemoveContainer" containerID="5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"
Sep 29 10:55:41 crc kubenswrapper[4922]: E0929 10:55:41.452938 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:55:54 crc kubenswrapper[4922]: I0929 10:55:54.452514 4922 scope.go:117] "RemoveContainer" containerID="5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"
Sep 29 10:55:54 crc kubenswrapper[4922]: E0929 10:55:54.453622 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:56:07 crc kubenswrapper[4922]: I0929 10:56:07.452092 4922 scope.go:117] "RemoveContainer" containerID="5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"
Sep 29 10:56:07 crc kubenswrapper[4922]: E0929 10:56:07.453053 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:56:19 crc kubenswrapper[4922]: I0929 10:56:19.452628 4922 scope.go:117] "RemoveContainer" containerID="5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"
Sep 29 10:56:19 crc kubenswrapper[4922]: E0929 10:56:19.453597 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:56:34 crc kubenswrapper[4922]: I0929 10:56:34.452170 4922 scope.go:117] "RemoveContainer" containerID="5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"
Sep 29 10:56:34 crc kubenswrapper[4922]: E0929 10:56:34.453056 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"
Sep 29 10:56:47 crc kubenswrapper[4922]: I0929 10:56:47.452851 4922 scope.go:117] "RemoveContainer" containerID="5bf5239aa06cb9dab71a0506f36bfaf6b4b6ddf6ce5d10f782bae0ec6a0148f4"
Sep 29 10:56:47 crc kubenswrapper[4922]: E0929 10:56:47.453777 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kgzgq_openshift-machine-config-operator(18583652-9871-4fba-93c8-9f86e9f57622)\"" pod="openshift-machine-config-operator/machine-config-daemon-kgzgq" podUID="18583652-9871-4fba-93c8-9f86e9f57622"